A Year of Hyperspectral Image Processing in the Cloud – HICO IPS reaches a milestone

It has been a little more than a year since we launched the HICO Image Processing System (HICO IPS), and its performance continues to be exceptional. As a prototype, HICO IPS has exceeded all expectations, running flawlessly since its launch in May 2015.

HICO IPS

HICO IPS is a web application for on-demand remote sensing image analysis that allows users to interactively select images and algorithms, dynamically launch analysis routines, and then see the results displayed directly in an online map interface; a rough sketch of what such an on-demand request might look like follows the list below. More details are as follows:

  • System developed to demonstrate capabilities for remote sensing image analysis in the cloud
  • Software stack utilizes a combination of commercial and open-source software
  • Core image processing and data management performed using ENVI Services Engine
  • Operational system hosted on Rackspace cloud server
  • Utilizes imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), which was deployed on the International Space Station (ISS) from 2009-2014
  • Example algorithms are included for assessing coastal water quality and other nearshore environmental conditions
  • Application developed in collaboration between HySpeed Computing and Exelis Visual Information Solutions (now Harris Geospatial Solutions)
  • Project supported by the Center for the Advancement of Science in Space (CASIS)
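
For readers curious about what happens behind the map interface, here is a minimal sketch of how a client might launch an on-demand analysis against an ENVI Services Engine-backed service such as HICO IPS. The endpoint URL, task name, scene identifier, and parameter names are illustrative assumptions rather than the actual HICO IPS API; in the live application the web interface builds these requests for you.

```python
# Hypothetical sketch of an on-demand analysis request to an ENVI Services
# Engine-backed web service such as HICO IPS. The endpoint, task name, and
# parameters below are illustrative assumptions, not the actual API.
import requests

BASE_URL = "http://example.com/ese"  # placeholder service root

payload = {
    "taskName": "CoastalChlorophyll",             # hypothetical task name
    "inputScene": "HICO_Chesapeake_Bay_example",  # hypothetical scene identifier
    "parameters": {"apply_land_mask": True},
}

# Submit the job; a real service might instead return a job ID that the
# client polls for status before retrieving the finished product.
response = requests.post(f"{BASE_URL}/jobs", json=payload, timeout=300)
response.raise_for_status()

result = response.json()
print("Output product URL:", result.get("outputUrl"))
```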

And here’s a short overview of HICO IPS accomplishments and performance in the past year, including some infographics to help illustrate how the system has been utilized:

  • The application has received over 5000 visitors
  • Users represent over 100 different countries
  • System has processed a total of 1000 images
  • Equivalent area processed is 4.5 million square kilometers
  • The most popular scene selected for analysis was the Yellow River
  • The most popular algorithm was Chlorophyll, followed closely by Land Mask
  • Application has run continuously without interruption since launch in May 2015

HICO IPS infographics

Try it out today for yourself: http://hyspeedgeo.com/HICO/

What’s New in ENVI 5.3

As the geospatial industry continues to evolve, so too does the software. Here’s a look at what’s new in ENVI 5.3, the latest release of the popular image analysis software from Exelis VIS.

ENVI

  • New data formats and sensors. ENVI 5.3 now provides support to read and display imagery from Deimos-2, DubaiSat-2, Pleiades-HR and Spot mosaic tiles, GeoPackage vectors, Google-formatted SkySat-2, and Sentinel-2.
  • Spectral indices. In addition to the more than 60 indices already included in ENVI, new options include the Normalized Difference Mud Index (NDMI) and the Modified Normalized Difference Water Index (MNDWI); a minimal sketch of the MNDWI calculation appears after this list.
  • Atmospheric correction. The Quick Atmospheric Correction (QUAC) algorithm has been updated with the latest enhancements from Spectral Sciences, Inc. to help improve algorithm accuracy.
  • Digital elevation model. Users can now download the GMTED2010 DEM (7.5 arc seconds resolution) from the Exelis VIS website for use in improving the accuracy of Image Registration using RPC Orthorectification and Auto Tie Point Generation.
  • Point clouds. If you subscribe to the ENVI Photogrammetry Module (licensed separately from ENVI), the Generate Point Clouds by Dense Image Matching tool is now available for generating 3D point clouds from GeoEye-1, IKONOS, Pleiades-1A, QuickBird, SPOT-6, WorldView-1, -2, and -3, and the Digital Point Positioning Data Base (DPPDB).
  • LiDAR. The ENVI LiDAR module has been merged with ENVI and can now be launched directly from within the ENVI interface.
  • Geospatial PDF. Your views, including all currently displayed imagery, layers and annotations in those views, can now be exported directly to geospatial PDF files.
  • Spatial subset. When selecting files to add to the workspace, the File Selection tool now includes options to subset files by raster, vector, region of interest or map coordinates.
  • Regrid raster. Users can now regrid raster files to custom defined grids (geographic projection, pixel size, spatial extent and/or number of rows and columns).
  • Programming. The latest ENVI release also includes dozens of new tasks, too numerous to list here, that can be utilized for developing custom user applications in ENVI and ENVI Services Engine.
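
As a small illustration of the new spectral indices, below is a minimal NumPy sketch of the MNDWI calculation, using the standard formulation MNDWI = (Green - SWIR1) / (Green + SWIR1). This is purely conceptual; it operates on plain arrays rather than going through the ENVI API.

```python
# Minimal sketch of the Modified Normalized Difference Water Index (MNDWI),
# one of the new indices in ENVI 5.3: MNDWI = (Green - SWIR1) / (Green + SWIR1).
# The band arrays here are assumed NumPy inputs; this is not the ENVI API.
import numpy as np

def mndwi(green: np.ndarray, swir1: np.ndarray) -> np.ndarray:
    """Return MNDWI, with zero-denominator pixels set to NaN."""
    green = green.astype(np.float64)
    swir1 = swir1.astype(np.float64)
    denom = green + swir1
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (green - swir1) / denom
    index[denom == 0] = np.nan
    return index

# Example with synthetic reflectance values
green = np.array([[0.10, 0.25], [0.05, 0.30]])
swir1 = np.array([[0.30, 0.05], [0.05, 0.10]])
print(mndwi(green, swir1))  # water pixels trend toward positive values
```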

To learn more about the above features and improvements, as well as many more, read the latest release notes or check out the ENVI help documentation.

ENVI 5.3

HICO Image Gallery – Looking beyond the data

What’s in an image? Beyond the visual impact, beyond the pixels, and beyond the data, there’s valuable information to be had. It just takes the right tools to extract that information.

With that thought in mind, HySpeed Computing created the HICO Image Processing System to make these tools readily available and thereby put image processing capabilities directly in your hands.

The HICO IPS is a prototype web application for on-demand remote sensing image analysis in the cloud. It’s available through your browser, so it doesn’t require any specialized software, and you don’t have to be a remote sensing expert to use the system.

HICO, the Hyperspectral Imager for the Coastal Ocean, which operated on the International Space Station from 2009 to 2014, is the first space-based imaging spectrometer designed specifically to measure the coastal environment. And research shows that substantial amounts of information can be derived from this imagery.

To commemorate the recent launch of the HICO IPS and celebrate the beauty of our coastal environment, we’ve put together a gallery highlighting some of the stunning images acquired by HICO that are available in the system.

We hope you enjoy the images, and encourage you to explore the HICO IPS web application to try out your own remote sensing analysis.

HICO IPS: Chesapeake Bay Chla

To access the HICO Image Processing System: http://hyspeedgeo.com/HICO/

For more information on HICO: http://hico.coas.oregonstate.edu/

Geospatial Solutions in the Cloud

Source: Exelis VIS whitepaper – 12/2/2014 (reprinted with permission)

What are Geospatial Analytics?

Geospatial analytics allow people to ask questions of data that exist within a spatial context. Usually this means extracting information from remotely sensed data, such as multispectral imagery or LiDAR, that is focused on observing the Earth and the things happening on it, both at a single point in time and over a period of time. Familiar examples of this type of geospatial analysis include Land Classification, Change Detection, Soil and Vegetation Indices, and, depending on the bands of your data, Target Detection and Material Identification. However, geospatial analytics can also mean analyzing data that is not optical in nature.

So what other types of problems can geospatial analytics solve? Geospatial analytics comprise more than just images laid over a representation of the Earth. Geospatial analytics can ask questions of ANY type of geospatial data, and provide insight into static and changing conditions within a multi-dimensional space. Things like aircraft vectors in space and time, wind speeds, or ocean currents can be introduced into geospatial algorithms to provide more context to a problem and to enable new correlations to be made between variables.

Many times, advanced analytics like these can benefit from the power of cloud, or server-based, computing. Benefits of cloud-based geospatial analytics include the ability to serve on-demand analytic requests from connected devices, to run complex algorithms on large datasets, and to perform continuous analysis on a series of changing variables. Cloud analytics also improve the ability to conduct multi-modal analysis, or processes that take into account many different types of geospatial information.

Here we can see vectors of a UAV along with the ground footprint of the sensor overlaid in Google Earth™, as well as a custom interface built on ENVI that allows users to visualize real-time weather data in four dimensions (Figure 1).

geospatial_cloud_fig1

Figure 1 – Multi-Modal Geospatial Analysis – data courtesy NOAA

These are just a few examples of non-traditional geospatial analytics that cloud-based architecture is very good at solving.

Cloud-Based Geospatial Analysis Models 

So let’s take a quick look at how cloud-based analytics work. There are two different operational models for running analytics: the on-demand model and the batch process model. In an on-demand model (Figure 2), a user generally requests a specific piece of information from a web-enabled device such as a computer, a tablet, or a smart phone. Here the user is making the request to a cloud-based resource.

geospatial_cloud_fig2

Figure 2 – On-Demand Analysis Model

Next, the server identifies the requested data and runs the selected analysis on it. This leverages scalable server architecture that can vastly decrease the amount of time it takes to run the analysis and eliminates the need to host the data or the software on the web-enabled device. Finally, the requested information is sent back to the user, usually at a fraction of the bandwidth cost that would be required to move large amounts of data or full-resolution derived products across the internet.
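
To make the on-demand model concrete, here is an illustrative server-side sketch using Flask purely as an example framework. The route, request fields, and run_algorithm() helper are hypothetical stand-ins rather than the architecture of any particular product; the point is that only a compact derived product travels back to the user.

```python
# Illustrative server-side sketch of the on-demand model in Figure 2, using
# Flask purely as an example framework. The route, request fields, and
# run_algorithm() helper are hypothetical stand-ins for the real system.
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_algorithm(scene_id: str, algorithm: str) -> dict:
    """Placeholder for a server-side analysis; returns a small summary
    product instead of shipping the full-resolution imagery back."""
    return {"scene": scene_id, "algorithm": algorithm, "mean_value": 0.42}

@app.route("/analyze", methods=["POST"])
def analyze():
    req = request.get_json()
    summary = run_algorithm(req["scene_id"], req["algorithm"])
    # Only the derived information travels back to the browser or device,
    # which is where the bandwidth savings described above come from.
    return jsonify(summary)

if __name__ == "__main__":
    app.run(port=8080)
```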

In the automated batch process analysis model (Figure 3), the cloud is designed to run prescribed analyses on data as it becomes available to the system, reducing the amount of manual interaction and time it takes to prepare or analyze the data. This system can take in huge volumes of data from various sources such as aerial or satellite images, vector data, full motion video, radar, or other data types, and then run a set of pre-determined analyses on that data depending on the data type and the requested information.

geospatial_cloud_fig3

Figure 3 – Automated Batch Process Model

Once the data has been pre-processed, it is ready for consumption, and the information is either pushed out to another cloud-based asset, such as an individual user who needs to request information or monitor assets in real time, or simply placed into a database in a ‘ready-state’ to be accessed and analyzed later.

The ability of this type of system to leverage the computing power of scalable server stacks enables the processing of huge amounts of data and greatly reduces the time and resources needed to get raw data into a consumable state.
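
Below is a minimal sketch of the batch idea, under the assumption of a simple polling loop: new files arriving in an incoming location are routed to a prescribed analysis by data type, and the results are written to a "ready-state" store for later access. The paths and handler functions are placeholders, not the design of any specific system.

```python
# Hypothetical sketch of the batch model in Figure 3: new files are picked up
# from an incoming directory, routed to a prescribed analysis by type, and the
# results written to a "ready-state" store. Paths and handlers are assumptions.
import pathlib
import time

INCOMING = pathlib.Path("/data/incoming")
READY = pathlib.Path("/data/ready")

def process_raster(path): ...
def process_vector(path): ...
def process_video(path): ...

HANDLERS = {".tif": process_raster, ".shp": process_vector, ".mp4": process_video}

def watch(poll_seconds: int = 60) -> None:
    seen = set()
    while True:
        for path in INCOMING.iterdir():
            if path in seen:
                continue
            handler = HANDLERS.get(path.suffix.lower())
            if handler is not None:
                result = handler(path)  # prescribed analysis by data type
                (READY / (path.stem + ".json")).write_text(str(result))
                seen.add(path)
        time.sleep(poll_seconds)
```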

Solutions in the Cloud

HySpeed Computing

Now let’s take a look at a couple of use cases that employ ENVI capabilities in the cloud. The first is a web-based interface that allows users to perform on-demand geospatial analytics on hyperspectral data supplied by HICO™, the Hyperspectral Imager for the Coastal Ocean (Figure 4). HICO is a hyperspectral imaging spectrometer that is attached to the International Space Station (ISS) and is designed specifically for sampling the coastal ocean in an effort to further our understanding of the world’s coastal regions.

geospatial_cloud_fig4

Figure 4 – The HICO Sensor – image courtesy of NASA

Developed by HySpeed Computing, the prototype HICO Image Processing System (Figure 5) allows users to conduct on-demand image analysis of HICO’s imagery from a web-based browser through the use of ENVI cloud capabilities.

geospatial_cloud_fig5

Figure 5 – The HICO Image Processing System – data courtesy of NASA

The interface exposes several custom ENVI tasks designed specifically to take advantage of the unique spectral resolution of the HICO sensor to extract information characterizing the coastal environment. This type of interface is a good example of the on-demand scenario presented earlier, as it allows users to conduct on-demand analysis in the cloud without the need to have direct access to the data or the computing power to run the hyperspectral algorithms.

The goal of this system is to provide ubiquitous access to the robust HICO catalog of hyperspectral data, as well as the ENVI algorithms needed to analyze it. This allows researchers and other analysts to conduct valuable coastal research using web-based interfaces while capitalizing on the efforts of the Office of Naval Research, NASA, and Oregon State University that went into the development, deployment, and operation of HICO.

Milcord

Another use case involves a real-time analysis scenario that comes from a company called Milcord and their dPlan Next Generation Mission Manager (Figure 6). The goal of dPlan is to “aid mission managers by employing an intelligent, real-time decision engine for multi-vehicle operations and re-planning tasks” [1]. What this means is that dPlan helps folks make UAV flight plans based upon a number of different dynamic factors, and delivers the best plan for multiple assets both before and during the actual operation.

geospatial_cloud_fig6

Figure 6 – The dPlan Next Generation Mission Manager

Factors that are used to help score the flight plans include fuel availability, schedule metrics based upon priorities for each target, as well as what are known as National Image Interpretability Rating Scales, or NIIRS (Figure 7). NIIRS are used “to define and measure the quality of images and performance of imaging systems. Through a process referred to as “rating” an image, the NIIRS is used by imagery analysts to assign a number which indicates the interpretability of a given image.” [2]

geospatial_cloud_fig7

Figure 7 – Extent of NIIRS 1-9 Grids Centered in an Area Near Calgary

These factors are combined into a cost function, and dPlan uses the cost function to find the optimal flight plan for multiple assets over a multitude of targets. dPlan also performs a cost-benefit analysis to indicate whether an asset cannot reach all of its targets, which target would be the least costly to remove from the plan, or whether another asset could visit that target instead.

dPlan employs a custom ESE application to generate huge grids of Line of Sight and NIIRS values associated with a given asset and target (Figure 8). dPlan uses this grid of points to generate route geometry, for example how close to and at what angle the asset needs to approach the target.

geospatial_cloud_fig8

Figure 8 – dPlan NIIRS Workflow
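
To illustrate the general idea of combining such factors, here is a hypothetical sketch of a weighted cost function used to rank candidate flight plans. The factor names, weights, and scoring are assumptions for illustration only and do not represent Milcord's actual formulation.

```python
# Illustrative sketch of combining planning factors into a single cost score,
# in the spirit of the dPlan description above. The factor names, weights, and
# scoring are assumptions for illustration, not Milcord's actual cost function.
from dataclasses import dataclass

@dataclass
class CandidatePlan:
    name: str
    fuel_used: float        # fraction of available fuel consumed (0-1)
    schedule_slip: float    # weighted lateness against target priorities (0-1)
    niirs_shortfall: float  # how far predicted NIIRS falls below requirements (0-1)

WEIGHTS = {"fuel_used": 0.3, "schedule_slip": 0.4, "niirs_shortfall": 0.3}

def cost(plan: CandidatePlan) -> float:
    """Lower is better: a weighted sum of normalized penalty terms."""
    return (WEIGHTS["fuel_used"] * plan.fuel_used
            + WEIGHTS["schedule_slip"] * plan.schedule_slip
            + WEIGHTS["niirs_shortfall"] * plan.niirs_shortfall)

plans = [
    CandidatePlan("route A", fuel_used=0.55, schedule_slip=0.20, niirs_shortfall=0.10),
    CandidatePlan("route B", fuel_used=0.40, schedule_slip=0.30, niirs_shortfall=0.05),
]
best = min(plans, key=cost)
print(best.name, round(cost(best), 3))  # route B, 0.255
```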

The cloud-computing power leveraged by dPlan allows users to re-evaluate flight plans on the fly, taking into account new information as it becomes available in real time. dPlan is a great example of how cloud-based computing combined with powerful analysis algorithms can solve complex problems in real time and reduce the resources needed to make accurate decisions amidst changing environments.

Solutions in the Cloud

So what do we do here at Exelis to enable folks like HySpeed Computing and Milcord to ask these kinds of questions of their data and retrieve reliable answers? The technology they’re using is called the ENVI Services Engine (Figure 9), an enterprise-ready version of the ENVI image analytics stack. We currently have over 60 out-of-the-box analysis tasks built into it, and are creating more with every release.

geospatial_cloud_fig9

Figure 9 – The ENVI Services Engine

The real value here is that ENVI Services Engine allows users to develop their own analysis tasks and expose them through the engine. This is what enables users to develop unique solutions to geospatial problems and share them as repeatable processes for others to use. These solutions can be run over and over again on different data and provide consistent, dependable information to the people requesting the analysis. The cloud-based technology makes it easy to access from web-enabled devices while leveraging the enormous computing power of scalable server instances. This combination of customizable geospatial analysis tasks and virtually limitless computing power begins to address some of the limiting factors of analyzing what is known as big data, or datasets so large and complex that traditional computing practices are not sufficient to identify correlations within disconnected data streams.
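
As a conceptual illustration of exposing a repeatable analysis task, here is a generic sketch in which an analysis function is registered with a declared parameter interface so it can be discovered and re-run on different data. The descriptor format and register_task() helper are illustrative assumptions, not the ENVI Services Engine task interface.

```python
# Generic sketch of packaging an analysis as a reusable, parameterized task so
# it can be exposed through an engine and re-run on different data. The
# descriptor format and register_task() call are illustrative assumptions and
# not the actual ENVI Services Engine task interface.
from typing import Callable, Dict

TASK_REGISTRY: Dict[str, dict] = {}

def register_task(name: str, func: Callable, parameters: dict) -> None:
    """Record a task and its declared parameters so clients can discover it."""
    TASK_REGISTRY[name] = {"run": func, "parameters": parameters}

def water_quality_index(scene_id: str, threshold: float = 0.5) -> dict:
    """Placeholder analysis: a real task would process the raster here."""
    return {"scene": scene_id, "flagged": threshold < 0.6}

register_task(
    "WaterQualityIndex",
    water_quality_index,
    parameters={"scene_id": "string", "threshold": "float"},
)

# A client (or web front end) could then run the same task repeatedly:
task = TASK_REGISTRY["WaterQualityIndex"]
print(task["run"]("HICO_scene_001", threshold=0.4))
```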

Our goal here at Exelis is to enable you to develop custom solutions to industry-specific geospatial problems using interoperable, off-the-shelf technology. For more information on what we can do for you and your organization, please feel free to contact us.

Sources:

[1] 2014. Milcord. “Geospatial Analytics in the Cloud: Successful Application Scenarios” webinar. https://event.webcasts.com/starthere.jsp?ei=1042556

[2] 2014. The Federation of American Scientists. “National Image Interpretability Rating Scales”. http://fas.org/irp/imint/niirs.htm

 

A Look at What’s New in ENVI 5.2

Earlier this month Exelis Visual Information Solutions released ENVI 5.2, the latest version of their popular geospatial analysis software.

ENVI 5.2

ENVI 5.2 includes a number of new image processing tools as well as various updates and improvements to current capabilities. We’ve already downloaded our copy and started working with the new features. Here’s a look at what’s included.

A few of the most exciting new additions to ENVI include the Spatiotemporal Analysis tools, Spectral Indices tool, Full Motion Video player, and improved integration with ArcGIS:

  • Spatiotemporal Analysis. Just like the name sounds, this feature gives users the ability to analyze stacks of imagery through space and time. Most notably, tools are now available to build a raster series, where images are ordered sequentially by time, to reproject images from multiple sensors into a common projection and grid size, and to animate and export videos of these raster series (a conceptual sketch of the raster-series idea appears after this list).
  • Spectral Indices. Expanding on the capabilities of the previous Vegetation Index Calculator, the new Spectral Indices tool includes 64 different indices, which in addition to analyzing vegetation can also be used to investigate geology, man-made features, burned areas, and water. The tool conveniently presents only those indices that can be calculated for a given input image based on its spectral characteristics, so when you launch the tool you will see only the indices that apply to your imagery.
  • Full Motion Video. ENVI 5.2 now supports video, allowing users to not just play video, but also convert video files to time-enabled raster series and extract individual video frames for analysis using standard ENVI tools. Supported file formats include Skybox SkySat video, Adobe Flash Video and Shockwave Flash, Animated GIF, Apple Quicktime, Audio Video Interleaved, Google WebM Matroska, Matroska Video, Motion JPEG and JPEG2000, MPEG-1 Part 2, MPEG-2 Transport Stream, MPEG-2 Part 2, MPEG-4 Part 12 and MPEG-4 Part 14.
  • Integration with ArcGIS. Building on the interoperability originally introduced in ENVI 5.0, additional functionality has been added for ENVI to seamlessly interact with ArcGIS, including the ability to integrate analysis tools and image output layers in a concurrent session of ArcMap. For those working in both software domains, this helps simplify your geospatial workflows and more closely integrate your raster and vector analyses.
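
To give a feel for the raster-series concept behind the Spatiotemporal Analysis tools, here is a conceptual NumPy sketch that stacks co-registered scenes in time order and computes simple per-pixel statistics through the series. The synthetic data and chosen statistics are assumptions for illustration, not ENVI's implementation.

```python
# Conceptual sketch of a raster series: co-registered scenes ordered by time
# and stacked so per-pixel statistics can be computed through the series.
# Synthetic data and the chosen statistics are assumptions, not ENVI's tools.
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are five co-registered scenes (rows x cols), ordered by date.
dates = ["2014-01", "2014-04", "2014-07", "2014-10", "2015-01"]
series = np.stack([rng.random((100, 100)) for _ in dates], axis=0)  # (time, y, x)

# Per-pixel statistics through time
mean_map = series.mean(axis=0)
trend_map = series[-1] - series[0]   # simple first-to-last change
stdev_map = series.std(axis=0)       # variability over the series

print(series.shape, float(trend_map.mean()))
```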

Other noteworthy additions in this ENVI release include:

  • New data types. ENVI 5.2 now provides support to read and display imagery from AlSat-2A, Deimos-1, Gaofen-1, Proba-V S10, Proba-V S1, SkySat-1, WorldView-3, Ziyuan-1-02C and Ziyuan-3A, as well as data formats GRIB-1, GRIB-2, Multi-page TIFF and NetCDF-4.
  • NNDiffuse Pan Sharpening. A new pan sharpening tool based on nearest neighbor diffusion has been added, which is multi-threaded for high-performance image processing.
  • Scatter Plot Tool. The previous scatter plot tool has been updated and modernized, allowing users to dynamically switch bands, calculate spectral statistics, interact with ROIs, and generate density slices of the displayed spectral data.
  • Raster Color Slice. This useful tool has also been updated, particularly from a performance perspective, providing dynamic updates in the image display according to parameter changes made in the tool.

For those interested in implementing ENVI in the cloud, the ENVI 5.2 release also marks the release of ENVI Services Engine 5.2, an enterprise version of ENVI that facilitates on-demand, scalable, web-based image processing applications. As an example, HySpeed Computing is currently developing a prototype implementation of ESE for processing hyperspectral imagery from the HICO sensor on the International Space Station. The HICO Image Processing System will soon be publicly available for testing and evaluation by the community. A link to access the system will be provided on our website once it is released.

HICO IPS

To learn about the above features, and many more not listed here, see the video from Exelis VIS and/or read the latest release notes on ENVI 5.2.

We’re excited to put the new tools to work. How about you?

NASA Takes Over Navy Instrument On ISS

A version of this article appears in the May 19 edition of Aviation Week & Space Technology, p. 59, by Frank Morring, Jr.

HREP on the JEM Exposed Facility

A hyperspectral imager on the International Space Station (ISS) that was developed by the U.S. Navy as an experiment in littoral-warfare support is finding new life as an academic tool under NASA management, and has already drawn some seed money as a pathfinder for commercial Earth observation.

Facing Earth in open space on the Japanese Experiment Module’s porchlike Exposed Facility, the Hyperspectral Imager for the Coastal Ocean (HICO) continues to return at least one image a day of near-shore waters with unprecedented spectral and spatial resolution.

HICO was built to provide a low-cost means to study the utility of hyperspectral imaging from orbit in meeting the Navy’s operational needs close to shore. Growing out of its experiences in the Persian Gulf and other shallow-water operations, the Office of Naval Research wanted to evaluate the utility of space-based hyperspectral imagery to characterize littoral waters and conduct bathymetry to track changes over time that could impact operations.

The Naval Research Laboratory (NRL) developed HICO, which was based on airborne hyperspectral imaging technology and off-the-shelf hardware to hold down costs. HICO was launched Sept. 10, 2009, on a Japanese H-II Transfer Vehicle as part of the HICO and RAIDS (Remote Atmospheric and Ionospheric Detection System) Experimental Payloads (HREP); it returned its first image two weeks later.

In three years of Navy-funded operations, HICO “exceeded all its goals,” says Mary Kappus, coastal and ocean remote sensing branch head at NRL.

“In the past it was blue ocean stuff, and things have moved more toward interest in the coastal ocean,” she says. “It is a much more difficult environment. In the open ocean, multi-spectral was at least adequate.”

NASA, the U.S. partner on the ISS, took over HICO in January 2013 after Navy funding expired. The Navy also released almost all of the HICO data collected during its three years running the instrument. It has been posted for open access on the HICO website managed by Oregon State University.

While the Navy program was open to most researchers, the principal-investigator approach and the service’s multistep approval process made it laborious to gain access to the HICO instrument.

“[NASA] wanted it opened up, and we had to get permission from the Navy to put the historical data on there,” says Kappus. “So anything we collect now goes on there, and then we ask the Navy for permission to put old data on there. They reviewed [this] and approved releasing most of it.”

Under the new regime NRL still operates the HICO sensor, but through the NASA ISS payload office at Marshall Space Flight Center. This more-direct approach has given users access to more data and, depending on the target’s position relative to the station orbit, a chance to collect two images per day instead of one. Kappus explains that the data buffer on HICO is relatively small, so coordination with the downlink via the Payload Operations Center at Marshall is essential to collecting data before the buffer fills up.

Task orders are worked through the same channels. Presenting an update to HICO users in Silver Spring, Md., on May 7, Kappus said 171 of 332 total “scenes” targeted between Nov. 11, 2013, and March 12 were requested by researchers backed by the NRL and NASA; international researchers comprised the balance.

Data from HICO is posted on NASA’s Ocean Color website, where usage also is tracked. After the U.S., “China is the biggest user” of the website data, Kappus says, followed by Germany, Japan and Russia. The types of data sought, such as seasonal bathymetry that shows changes in the bottom of shallow waters, have remained the same through the transition from Navy to NASA.

“The same kinds of things are relevant for everybody; what is changing in the water,” she says.

HICO offers unprecedented detail from its perch on the ISS, providing 90-meter (295-ft.) resolution across wavelengths of 380-960 nanometers sampled at 5.7 nanometers. Sorting that rich dataset requires sophisticated software, typically custom-made and out of the reach of many users.

To expand the user set for HICO and future Earth-observing sensors on the space station, the Center for the Advancement of Science in Space, the non-profit set up by NASA to promote the commercial use of U.S. National Laboratory facilities on the ISS, awarded a $150,000 grant to HySpeed Computing, a Miami-based startup, and [Exelis] to demonstrate an online imaging processing system that can rapidly integrate new algorithms.

James Goodman, president/CEO of HySpeed, says the idea is to build a commercial way for users to process HICO data for their own needs at the same place online that they get it.

“Ideally a copy of this will [be] on the Oregon State server where the data resides,” Goodman says. “As a HICO user you would come in and say ‘I want to use this data, and I want to run this process.’ So you don’t need your own customized remote-sensing software. It expands it well beyond the research crowd that has invested in high-end remote-sensing software. It can be any-level user who has a web browser.”