The Panama Canal from Space – A collection of satellite images before and after the Expansion Project

To commemorate the completion of the Panama Canal Expansion Project, and in tribute to the upcoming official opening on June 26, we present a series of before-and-after satellite photos highlighting the Expansion Project and showcasing this engineering marvel.

Work on the Panama Canal Expansion began in September 2007 and took nearly nine years to complete, at a cost of US$5.2 billion. By adding a third set of locks on both the Pacific and Caribbean sides of the canal, dredging the existing navigation channel, adding a new approach channel on the Caribbean side and a new 6.1 km access channel on the Pacific side, and raising the maximum operating level of Gatun Lake, the Expansion doubles the capacity of the Canal and significantly increases the size of vessels that can transit it.

Below we present before and after satellite images of the newly expanded Canal, provide an overview of the Expansion Project, show a rare nearly-cloudless image from Landsat-5, and even include one of the earliest Landsat images of the Panama Canal acquired by Landsat-1 on March 18, 1973.

Panama Canal Expansion - Pacific

Satellite views of the Pacific Ocean entrance to the Panama Canal, before (left; Landsat-7 on November 20, 2002) and after (right; Landsat-8 on June 11, 2016) the Expansion Project. Note the addition of the third set of locks, the three sets of water reutilization basins immediately adjacent to the new locks, and the new access channel that now bypasses Miraflores Lake.

 

Panama Canal Expansion - Caribbean

Satellite views of the Caribbean Sea entrance to the Panama Canal, before (left; Landsat-7 on May 28, 2002) and after (right; Landsat-8 on February 20, 2016) the Expansion Project. Note the addition of the third set of locks, the three sets of water reutilization basins immediately adjacent to the new locks, and the new approach channel.

 

Overview - Panama Canal Expansion Project

An overview of the Panama Canal Expansion Project (from: http://micanaldepanama.com/expansion/).

 

Panama Canal - Landsat-5 Cloud Free

This is a rare nearly-cloudless glimpse of the entire Panama Canal acquired by Landsat-5 on March 27, 2000.

 

Panama Canal - Landsat-1

For the remote sensing history aficionados, this is the earliest Landsat image of the Panama Canal listed in the USGS archives, from Landsat-1 on March 18, 1973, over 40 years ago.

Astronaut Photography – Your access to stunning views from space

Astronauts have busy schedules in space – system operations, maintenance, repairs, science experiments – but did you know they also acquire hundreds of photos during each mission?

Reid Wiseman, Astronaut Photography

From stunning views of Earth’s natural features to glimpses of your favorite city at night, and from pure artistry to applied science, these photos offer a remarkable perspective of our planet’s surface as well as a valuable historical record of how and where our planet is changing.

There are now two great resources available for viewing this photography: the Gateway to Astronaut Photography of Earth and Windows on Earth, both described below.

Both websites provide access to thousands of photos, are free to use, allow users to search photos or browse by category, and even provide options to download images for your own use (but be sure to read through the conditions of use on both websites).

We’ve spent countless hours browsing through these stunning image collections, and encourage you to take a look for yourself.

We hope you enjoy!

Gateway to Astronaut Photography of Earth

“The Gateway to Astronaut Photography of Earth hosts the best and most complete online collection of astronaut photographs of the Earth from 1961 through the present. This service is provided by the International Space Station program and the JSC Earth Science & Remote Sensing Unit, ARES Division, Exploration Integration Science Directorate.” – http://eol.jsc.nasa.gov/

Windows on Earth

“Windows on Earth is an educational project that features photographs taken by astronauts on the International Space Station.  Astronauts take hundreds of photos each day, for science research, education and public outreach.  The photos are often dramatic, and help us all appreciate home planet Earth. The site is operated by TERC, an educational non-profit, in collaboration with the Association of Space Explorers (the professional association of flown astronauts and cosmonauts), the Virtual High School, and CASIS (Center for Advancement of Science in Space).” – http://www.windowsonearth.org/

Windows on Earth featured

HICO Image Gallery – Looking beyond the data


What’s in an image? Beyond the visual impact, beyond the pixels, and beyond the data, there’s valuable information to be had. It just takes the right tools to extract that information.

With that thought in mind, HySpeed Computing created the HICO Image Processing System to make these tools readily available and thereby put image processing capabilities directly in your hands.

The HICO IPS is a prototype web application for on-demand remote sensing image analysis in the cloud. It’s available through your browser, so it doesn’t require any specialized software, and you don’t have to be a remote sensing expert to use the system.

HICO, the Hyperspectral Imager for the Coastal Ocean, which operated on the International Space Station from 2009 to 2014, is the first space-based imaging spectrometer designed specifically to measure the coastal environment. Research shows that substantial amounts of information can be derived from this imagery.

To commemorate the recent launch of the HICO IPS and celebrate the beauty of our coastal environment, we’ve put together a gallery highlighting some of the stunning images acquired by HICO that are available in the system.

We hope you enjoy the images, and encourage you to explore the HICO IPS web application to try out your own remote sensing analysis.

HICO IPS: Chesapeake Bay Chla

To access the HICO Image Processing System: http://hyspeedgeo.com/HICO/

For more information on HICO: http://hico.coas.oregonstate.edu/

Visualizing HICO Ground Tracks Using Google Earth – A useful tool for project planning

Do you work with HICO imagery? Are you planning a project using HICO? Or perhaps you’re just interested in exploring where HICO will be acquiring imagery in the coming days?

If so, be sure to check out the ISS Orbit tool on the HICO website at Oregon State University. This tool lets you interactively visualize HICO ground track locations using Google Earth.

HICO ISS Orbit tool

The tool shows predicted HICO ground tracks at selected 1- or 3-day intervals up to six months into the future. Although the orbital files are updated regularly, uncertainties in the future ISS orbit mean the predictions are most accurate 2-3 days out and degrade thereafter, so be cautious when planning fieldwork or image acquisitions for any extended time period.
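If you want to sanity-check this kind of overpass prediction yourself, a rough estimate can be made from publicly available ISS orbital elements. The sketch below assumes the skyfield Python package and a current TLE from CelesTrak; it is an illustration only, not the ISS Orbit tool, and it will not reproduce HICO-specific scheduling.

```python
# Rough sketch: estimate upcoming ISS passes over a site from a current TLE.
# Assumes the `skyfield` package and network access to CelesTrak; this is an
# approximation for planning only, not the HICO ISS Orbit tool itself.
from datetime import timedelta
from skyfield.api import load, wgs84

ts = load.timescale()

# Current ISS orbital elements; as noted above, predictions are only
# meaningful within a few days of the TLE epoch.
stations = load.tle_file("https://celestrak.org/NORAD/elements/stations.txt")
iss = {sat.name: sat for sat in stations}["ISS (ZARYA)"]

site = wgs84.latlon(-37.85, 144.93)  # example site (hypothetical coordinates)

t0 = ts.now()
t1 = ts.from_datetime(t0.utc_datetime() + timedelta(days=3))  # look ~3 days ahead

# Passes where the ISS climbs above 60 degrees elevation, i.e., when the
# site lies close to the nominal ground track.
times, events = iss.find_events(site, t0, t1, altitude_degrees=60.0)
for t, event in zip(times, events):
    label = ("rise above 60 deg", "culminate", "set below 60 deg")[event]
    print(t.utc_strftime("%Y-%m-%d %H:%M GMT"), label)
```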

For more information on ISS orbit characteristics, visit the NASA Space Station Orbit tutorial.

The ground tracks are displayed only for local daylight hours, and illustrate both the nominal ground track (shown in teal above) and the full width available using HICO’s pointing capabilities (shown in grey above). Users also have the option of displaying the place names and locations of scheduled target areas for both ascending and descending orbits. Additionally, as the zoom level increases, yellow dots appear in the visualization indicating the predicted date and time the ISS will pass over that location.

The HICO ISS Orbit tool requires the Google Earth plugin, which is available in Chrome, Firefox and IE (note that IE users may need to add the oregonstate.edu website to Compatibility View in the tool settings).

Let’s look at an example. Say you’re interested in exploring when HICO will be able to acquire imagery of Melbourne Harbor from April 5-11. Using the tool to step through the ISS orbits for those dates shows that Melbourne Harbor can be acquired on April 5 at 22:26 and 5:45 GMT, April 6 at 4:56 GMT, and April 9 at 4:05 GMT.

HICO Melbourne Harbor 040514

ISS Orbit tool: HICO – Melbourne Harbor 5-April-2014

HICO Melbourne Harbor 040614

ISS Orbit tool: HICO – Melbourne Harbor 6-April-2014

HICO Melbourne Harbor 040914

ISS Orbit tool: HICO – Melbourne Harbor 9-April-2014

Now let’s extend this example to see if Hyperion data is also available for Melbourne Harbor for the same dates. To do so, you will need to use COVE, a similar tool (best viewed in Chrome or Firefox) with robust capabilities for visualizing ground tracks of numerous Earth observing satellites (though unfortunately not HICO or any other instruments on the ISS). Visit our earlier post for an overview of COVE’s capabilities.

Using COVE, we can see that Hyperion data can be acquired over Melbourne Harbor on April 9 at 23:16 GMT. This closely coincident acquisition opportunity might provide some interesting data for comparing hyperspectral analysis techniques using HICO and Hyperion.

Hyperion Melbourne Harbor 040914

COVE tool: Hyperion – Melbourne Harbor 9-April-2014

So be sure to check out both the COVE and HICO ISS Orbit tools when planning your next mission.

HICO ISS Orbit tool: http://hico.coas.oregonstate.edu/orbit/orbit.php

COVE: http://www.ceos-cove.org/

About HICO (http://hico.coas.oregonstate.edu/): “The Hyperspectral Imager for the Coastal Ocean (HICO™) is an imaging spectrometer based on the PHILLS airborne imaging spectrometers. HICO is the first spaceborne imaging spectrometer designed to sample the coastal ocean. HICO samples selected coastal regions at 90 m with full spectral coverage (380 to 960 nm sampled at 5.7 nm) and a very high signal-to-noise ratio to resolve the complexity of the coastal ocean. HICO demonstrates coastal products including water clarity, bottom types, bathymetry and on-shore vegetation maps. Each year HICO collects approximately 2000 scenes from around the world. The current focus is on providing HICO data for scientific research on coastal zones and other regions around the world. To that end we have developed this website and we will make data available to registered HICO Data Users who wish to work with us as a team to exploit these data.”

About Hyperion (http://eo1.gsfc.nasa.gov/ and http://eo1.usgs.gov/): “The Hyperion instrument provides a new class of Earth observation data for improved Earth surface characterization. The Hyperion provides a science grade instrument with quality calibration based on heritage from the LEWIS Hyperspectral Imaging Instrument (HSI). The Hyperion capabilities provide resolution of surface properties into hundreds of spectral bands versus the ten multispectral bands flown on traditional Landsat imaging missions. Through these spectral bands, complex land eco-systems can be imaged and accurately classified. The Hyperion provides a high resolution hyperspectral imager capable of resolving 220 spectral bands [from 400 to 2500 nm] with a 30-meter resolution. The instrument can image a 7.5 km by 100 km land area per image, and provide detailed spectral mapping across all 220 channels with high radiometric accuracy.”
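For a quick sense of how the two sensors compare spectrally, the figures quoted above already tell much of the story; the short sketch below simply turns those quoted numbers into approximate band counts and spacings (it is back-of-the-envelope arithmetic, not the official band tables).

```python
# Quick arithmetic from the figures quoted above (not official band counts):
# HICO samples 380-960 nm at 5.7 nm; Hyperion has 220 bands over 400-2500 nm.
hico_bands = (960 - 380) / 5.7           # roughly 102 spectral channels
hyperion_spacing = (2500 - 400) / 220    # roughly 9.5 nm average band spacing

print(f"HICO:     ~{hico_bands:.0f} bands at 5.7 nm over 380-960 nm")
print(f"Hyperion: 220 bands, ~{hyperion_spacing:.1f} nm spacing over 400-2500 nm")
```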

Important Update for Landsat 8 – Reprocessing of entire archive scheduled to begin Feb 3

Starting today, Feb 3, 2014, the USGS will remove all Landsat 8 scenes from the online cache and reprocess the entire archive using updated calibration parameters.

Landsat 8

Artist’s rendition of Landsat 8 (NASA/GSFC Conceptual Image Lab)

The reprocessing will begin with the most recent acquisitions and then progress backwards to the beginning of the Landsat 8 mission. The entire reprocessing exercise is expected to take 50 days; however, individual scenes will also be available sooner through on-demand product orders.

Rest assured, both the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) on Landsat 8 are operating correctly and producing quality measurements. The reprocessing is being undertaken to implement improvements in the calibration parameters, taking advantage of “radiometric and geometric refinements” that have been identified since launch, to “ensure good calibration and data continuity.”

The USGS states that “Most users will not need to reorder data currently in their local archive; however, users are encouraged to review all Landsat 8 calibration notices and evaluate the improvements as they relate to specific applications.”

“These corrections include all calibration parameter file updates since launch; improved OLI reflectance conversion coefficients for the cirrus band; improved OLI radiance conversion coefficients for all bands; refined OLI detector linearization to decrease striping; a radiometric offset correction for both TIRS bands; and a slight improvement to the geolocation of the TIRS data.”

More specifically, as outlined in the calibration notices:

  • OLI bands 1-8 will have reflectance changes of up to 0.8 percent.
  • OLI cirrus band 9 will have a more substantial reflectance change of about 7 percent.
  • Vertical striping in OLI bands for dark uniform areas, such as water, will be reduced.
  • TIRS offsets will remove about 2.1 K from band 10 and about 4.4 K from band 11.
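To see why updated coefficients matter, it helps to recall how OLI digital numbers are converted to top-of-atmosphere (TOA) reflectance using the rescaling values distributed in each scene’s MTL metadata file. Below is a minimal Python sketch of that conversion; the coefficient values shown are placeholders rather than values from a specific scene, so treat it as an illustration, not a processing recipe.

```python
import numpy as np

# Sketch: OLI digital numbers (Qcal) to TOA reflectance using the
# REFLECTANCE_MULT_BAND_x / REFLECTANCE_ADD_BAND_x rescaling coefficients
# from the scene's MTL file, corrected for sun elevation.
def toa_reflectance(dn, mult, add, sun_elevation_deg):
    rho = mult * dn.astype(np.float64) + add
    return rho / np.sin(np.radians(sun_elevation_deg))

# Placeholder values for illustration only (not from a real scene).
dn = np.array([[7423, 8110], [7980, 8255]])
rho = toa_reflectance(dn,
                      mult=2.0e-05,            # REFLECTANCE_MULT_BAND_9
                      add=-0.1,                # REFLECTANCE_ADD_BAND_9
                      sun_elevation_deg=62.3)  # SUN_ELEVATION
print(rho)
```

When the archive is reprocessed with refined coefficients, the same DN values map to slightly different reflectances, which is exactly the band-dependent change described in the list above.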

So if you’re using Landsat 8, be sure to check the reprocessing details to evaluate whether the changes will impact your analysis.

For more info on Landsat: http://landsat.usgs.gov/

EnMAP Coral Reef Simulation – The first of its kind

The GFZ German Research Center for Geosciences and HySpeed Computing announce the first-ever simulation of a coral reef scene using the EnMAP End-to-End Simulation tool. This synthetic, yet realistic, scene of French Frigate Shoals will be used to help test the marine and coral-reef-related analysis capabilities of the forthcoming EnMAP hyperspectral satellite mission.

EeteS EnMAP Simulation FFS

EeteS simulation of EnMAP scene for French Frigate Shoals, Hawaii

EnMAP (Environmental Mapping and Analysis Program) is a German hyperspectral satellite mission scheduled for launch in 2017. As part of the satellite’s development, the EnMAP End-to-End Simulation tool (EeteS) was created at GFZ to provide accurate simulation of the entire image generation, calibration and processing chain. EeteS is also being used to assist with overall system design, the optimization of fundamental instrument parameters, and the development and evaluation of data pre-processing and scientific-exploitation algorithms.

EeteS has previously been used to simulate various terrestrial scenes, such as agricultural and forested areas, but until now it had not been used to generate a coral reef scene. Considering the economic and ecological importance of coral reef ecosystems, the ability to refine existing analysis tools and develop new algorithms prior to launch is a critical step towards efficiently implementing new reef remote sensing capabilities once EnMAP is operational.

The input imagery for the French Frigate Shoals simulation was derived from a mosaic of four AVIRIS flightlines, acquired in April 2000 as part of an airborne hyperspectral survey of the Northwestern Hawaiian Islands by NASA’s Jet Propulsion Laboratory. Selection of this study area was based in part on the availability of these data and in part on the size of the atoll, which more than adequately fills the full 30 km width of an EnMAP swath. In addition to flightline mosaicking, image pre-processing included atmospheric and geographic corrections, generating a land/cloud mask, and minimizing the impact of sunglint. The final AVIRIS mosaic was provided as a single integrated scene of at-surface reflectance.
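Sunglint removal, one of the pre-processing steps mentioned above, is often handled with a simple regression against a near-infrared band over optically deep water (e.g., the widely used NIR-slope method of Hedley et al., 2005). The sketch below illustrates that general idea in Python; it is not necessarily the exact method applied to this AVIRIS mosaic.

```python
import numpy as np

# Illustrative deglinting via NIR-slope regression: for each visible band,
# regress brightness against a NIR band over optically deep water (where any
# variation is assumed to be glint), then subtract the fitted glint signal.
def deglint(band, nir, deep_water_mask):
    slope, _ = np.polyfit(nir[deep_water_mask], band[deep_water_mask], 1)
    return band - slope * (nir - nir[deep_water_mask].min())

# Hypothetical usage: 'cube' is a (bands, rows, cols) reflectance array,
# 'nir' a NIR band from the same image, 'deep_mask' marks deep-water pixels.
# corrected = np.stack([deglint(b, nir, deep_mask) for b in cube])
```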

For the EeteS simulation, the first step was to transform this AVIRIS mosaic into raw EnMAP data using a series of forward processing steps that model atmospheric conditions and account for spatial, spectral, and radiometric differences between the two sensors. The software then simulates the full EnMAP image processing chain, including onboard calibration, atmospheric correction and orthorectification modules to ultimately produce geocoded at-surface reflectance.
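One core piece of that forward chain is spectral resampling: fine-resolution AVIRIS spectra are convolved with the spectral response functions of the target EnMAP bands. The simplified sketch below, assuming Gaussian response functions and hypothetical band parameters, shows only this spectral step; EeteS itself models the full spatial, spectral, radiometric, and atmospheric chain.

```python
import numpy as np

# Simplified spectral resampling: weight a fine-resolution spectrum with a
# Gaussian response function centered on each target band, then normalize.
def resample_spectrum(wl_in, spectrum, centers, fwhm):
    sigma = fwhm / 2.3548  # convert FWHM to Gaussian sigma
    out = np.empty(len(centers))
    for i, (c, s) in enumerate(zip(centers, sigma)):
        weights = np.exp(-0.5 * ((wl_in - c) / s) ** 2)
        out[i] = np.sum(weights * spectrum) / np.sum(weights)
    return out

# Hypothetical example: ~10 nm input sampling resampled to coarser bands.
wl_in = np.arange(400.0, 1000.0, 10.0)
spectrum = np.interp(wl_in, [400, 700, 1000], [0.02, 0.08, 0.03])
centers = np.arange(420.0, 980.0, 30.0)
fwhm = np.full(centers.shape, 30.0)
print(resample_spectrum(wl_in, spectrum, centers, fwhm))
```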

The resulting scene visually appears to be an exact replica of the original AVIRIS mosaic, but more importantly now emulates the spatial and spectral characteristics of the new EnMAP sensor. The next step is for researchers to explore how different hyperspectral algorithms can be used to derive valuable environmental information from this data.

For more information on EnMAP and EeteS: http://www.enmap.org/

The EeteS image processing and the description above were prepared with contributions from Drs. Karl Segl and Christian Rogass (GFZ German Research Center for Geosciences).

Satellite Overlays in Google Earth – The CEOS Visualization Environment

Are you planning a remote sensing mission? Do you want to see the coverage of different satellite instruments before acquiring imagery or conducting your fieldwork? If so, then it’s definitely worth exploring the CEOS Visualization Environment (COVE) tool to visualize where and when different instruments are observing your study area.

COVE

COVE is “a browser-based system that leverages Google-Earth to display satellite sensor coverage areas and identify coincidence scene locations.” It is a collaborative project developed by the Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation and the NASA CEOS System Engineering Office (SEO). The COVE tool is available online at: http://www.ceos-cove.org/

The COVE web portal includes a suite of three main tools for planning remote sensing missions: the core COVE tool, which provides visualizations of instrument coverage; the Rapid Acquisition Tool, which allows users to identify and predict when sensors will cover specified study areas; and the Mission and Instrument Browser, which provides descriptions of the hundreds of different missions and instruments included in the COVE database.

So what can COVE do for you? As an example, let’s assume you want to acquire coincident Landsat-8 and WorldView-2 imagery over the reefs of southwestern Puerto Rico later this year. You can use COVE to calculate when and where instrument coverage will overlap, and then schedule your associated fieldwork and other mission planning accordingly. In this example, as shown here, Landsat-8 and WorldView-2 will overlap southwestern Puerto Rico on Nov-25-2013.

COVE_Landsat_WorldView

Ground swaths for Landsat-8 (left) and WorldView-2 (right) on Nov-25-2013

COVE_Puerto_Rico

Ground swath overlap for Landsat-8 and WorldView-2 on Nov-25-2013 in southwestern Puerto Rico
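COVE performs these coverage calculations internally, but the underlying idea is a straightforward geometric check: intersect the two ground-swath footprints and see whether your study area falls in the overlap. Here is a minimal sketch of that check using the shapely package; the corner coordinates are hypothetical and do not represent actual Landsat-8 or WorldView-2 swaths.

```python
from shapely.geometry import Polygon

# Minimal sketch of a coincident-coverage check: intersect two ground-swath
# footprints (lon/lat corners below are hypothetical) and report the overlap.
landsat_swath = Polygon([(-67.6, 17.6), (-65.9, 17.9), (-66.2, 19.4), (-67.9, 19.1)])
worldview_swath = Polygon([(-67.3, 17.8), (-67.1, 17.8), (-67.0, 18.2), (-67.2, 18.2)])

overlap = landsat_swath.intersection(worldview_swath)
if not overlap.is_empty:
    print("Coincident coverage, approx. area (deg^2):", round(overlap.area, 3))
else:
    print("No coincident coverage for these footprints")
```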

As you would expect, COVE includes many other options. Among these are the ability to incorporate different overlays, such as average annual and monthly cloud cover and precipitation, and the ability to display up to four globes simultaneously. Results can also be saved and exported as STK or KML files, or as a 2D global image. Given its usefulness and versatility, COVE has definitely found a permanent home in our mission planning toolbox.

COVE_Landsat_Globe

Global ground swath for Landsat-8 on Nov-25-2013

About CEOS: “Established in 1984, the Committee on Earth Observation Satellites (CEOS) coordinates civil space-borne observations of the Earth.  Participating agencies strive to enhance international coordination and data exchange and to optimize societal benefit. Currently, 53 members and associate members made up of space agencies, national, and international organizations participate in CEOS planning and activities.”

For more on CEOS: http://www.ceos.org/

Eyes on the Earth – An interactive tool for exploring NASA satellite visualizations

Just the other day we were discussing earth observing missions and a question arose regarding availability of a graphic or video illustrating the current satellite orbits. So we set out to answer that question and quickly discovered NASA’s ‘Eyes on the Earth.’

Have you seen this application? If you’re interested in satellites and remote sensing it’s definitely worth your time to download and explore. Just visit http://eyes.jpl.nasa.gov/earth/ and hit the start button to install the desktop application.

NASA is known for producing stunning visualizations that transform complex science into meaningful informational graphics. Anyone who has attended a NASA presentation or visited their website can attest to the thought and innovation that go into creating these products. The ‘Eyes on the Earth’ application is no exception to this creativity, bringing together an array of exceptional visualizations into a single integrated package.

The opening visualization of ‘Eyes on the Earth’ shows the current real-time position and orbital paths of NASA’s Earth observing satellites. The display is interactive, so you can easily rotate the planet into any desired position. There are also options to increase the speed of the visualization (bottom of screen), switch to full screen (lower right), toggle on/off city names and topography (upper right drop down menu), and adjust between realistic day/night lighting versus full illumination (lower right). The entire visualization can also be viewed in anaglyph 3D, assuming you have a pair of appropriate 3D glasses.

Eyes on the Earth

Want to learn more about a particular mission or specific type of observation? The application allows you to select from a number of pre-defined visualizations that portray different ‘vital signs of the planet’, including global temperature, carbon dioxide, carbon monoxide, sea level, and ozone. The different options can be accessed through buttons at the top of the interface or by selecting an individual satellite from within the central display. Users can then adjust the date range as well as type of data used for generating the display (daily versus three day averaging). With all these options, there’s a wealth of information available at your fingertips.

Eyes on the Earth

As a bonus, the NASA Eyes Visualization also includes ‘Eyes on the Solar System’ and ‘Eyes on Exoplanets’, which are equally intriguing to explore.

So sit back, set the application to full screen, and enjoy the experience.

Remote Sensing Archives – An overview of online resources for lidar data

Previous posts on data access have focused on general resources for obtaining remote sensing imagery – Getting your hands on all that imagery – and specific resources related to imaging spectrometry – A review of online resources for hyperspectral imagery. To add to this compendium of data resources, the following includes an overview of online archives for lidar data.

USGS lidar Mount St Helens

Lidar image of Mount St. Helens (courtesy USGS)

Lidar (light detection and ranging), also commonly referred to as LiDAR or LIDAR, is an “active” remote sensing technology, whereby laser pulses are used to illuminate a surface and the reflected return signals from these pulses are used to indicate the range (distance) to that surface. When combined with positional information and other data recorded by the airborne system, lidar produces a three-dimensional representation of the surface and the objects on that surface. Lidar technology can be utilized for terrestrial applications, e.g. topography, vegetation canopy height and infrastructure surveys, as well as aquatic applications, e.g. bathymetry and coastal geomorphology.
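The range measurement itself is simple geometry: the distance to the surface is the speed of light multiplied by the round-trip travel time of the pulse, divided by two. A one-line illustration, using a hypothetical return delay:

```python
# range = (speed of light x round-trip pulse travel time) / 2
c = 299_792_458.0            # speed of light, m/s
round_trip_time = 6.7e-6     # hypothetical return delay, seconds
print(c * round_trip_time / 2, "m")   # ~1004.3 m
```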

Below is an overview of archives that contain lidar data products and resources:

  • CLICK (Center for LIDAR Information Coordination and Knowledge) provides links to different publicly available USGS lidar resources, including EAARL, the National Elevation Dataset and EarthExplorer. The CLICK website also hosts a searchable database of lidar publications and an extensive list of links to relevant websites for companies and academic institutions using lidar data in their work.
  • EAARL (Experimental Advanced Airborne Research Lidar) is an airborne sensor system that has the capacity to seamlessly measure both submerged bathymetric surfaces and adjacent terrestrial topography. By selecting the “Data” tab on the EAARL website, and then following links to specific surveys, users can view acquisition areas using Google Maps and access data as ASCII xyz files, GeoTIFFs and LAS files (a standardized lidar data exchange format).
  • NED (National Elevation Dataset) is the USGS seamless elevation data product for the United States and its territorial islands. NED is compiled using the best available data for any given area, the highest-resolution and most accurate of which are derived from lidar data and digital photogrammetry. NED data are available through the National Map Viewer in a variety of formats, including ArcGRID, GeoTIFF, BIL and GridFloat. However, to access the actual lidar data, rather than the resulting integrated products, users need to visit EarthExplorer.
  • EarthExplorer is a consolidated data discovery portal for the USGS data archives, which includes airborne and satellite imagery, as well as various derived image products. EarthExplorer allows users to search by geographic area, date range, feature class and data type, and in most cases instantly download selected data. To access lidar data, which are provided as LAS files, simply select the lidar checkbox under the Data Sets tab as part of your search criteria.
  • JALBTCX (Joint Airborne Lidar Bathymetry Technical Center of Expertise) performs data collection and processing for the U.S. Army Corps of Engineers, the U.S. Naval Meteorology and Oceanography Command and NOAA. The JALBTCX website includes a list of relevant lidar publications, a description of the National Coastal Mapping Program, and a link to data access via NOAA’s Digital Coast.
  • Digital Coast is a service provided by NOAA’s Coastal Services Center that integrates coastal data accessibility with software tools, technology training and success stories. Of the 950 data layers currently listed in the Digital Coast archive, lidar data represents nearly half of the available products. Searching for lidar data can be achieved using the Data Access Viewer and selecting “elevation” as the data type in your search, or by following the “Data” tab on the Digital Coast home page and entering “lidar” for the search criteria. The data is available in a variety of data formats, including ASCII xyz, LAS, LAZ, GeoTIFF and ASCII Grid, among others.
  • NCALM (National Center for Airborne Laser Mapping) is a multi-university partnership funded by the U.S. National Science Foundation, whose mission is to provide community access to high quality lidar data and support scientific research in airborne laser mapping. Data is accessible through the OpenTopography portal, either in KML format for display in Google Earth, as pre-processed DEM products, or as lidar point clouds in ASCII and LAS formats.
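Most of the archives above deliver point clouds as LAS (or compressed LAZ) files. As a minimal sketch of working with such a download, the Python snippet below, which assumes the laspy package and a placeholder filename, reads a LAS file and grids the elevations into a coarse surface model.

```python
import numpy as np
import laspy  # assumed: the laspy package (v2.x API shown)

# Sketch: read a downloaded LAS point cloud and grid elevations into a
# coarse surface model. "survey.las" is a placeholder filename.
las = laspy.read("survey.las")
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

cell = 5.0  # grid cell size in the file's horizontal units (e.g., meters)
cols = ((x - x.min()) // cell).astype(int)
rows = ((y - y.min()) // cell).astype(int)

grid = np.full((rows.max() + 1, cols.max() + 1), np.nan)
# Keep the highest return per cell (a crude first-return surface model).
for r, c, elev in zip(rows, cols, z):
    if np.isnan(grid[r, c]) or elev > grid[r, c]:
        grid[r, c] = elev

print("Grid shape:", grid.shape,
      "elevation range:", np.nanmin(grid), "-", np.nanmax(grid))
```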

Lidar can be useful on its own, e.g. topography and bathymetry, and can also be merged with other remote sensing data, such as multispectral and hyperspectral imagery, to provide valuable three-dimensional information as input for further analysis. For example, lidar derived bathymetry can be used as input to improve hyperspectral models of submerged environments in the coastal zone. There has also been more widespread use of full-waveform lidar, which provides increased capacity to discriminate surface characteristics and ground features, as well as increased use of lidar return intensity, which can be used to generate a grayscale image of the surface.

What is readily apparent is that as the technology continues to improve, data acquisition becomes more cost effective, and data availability increases, lidar will play an important role in more and more remote sensing investigations.

HyPhoon is coming!

HyPhoon

HySpeed Computing is proud to announce the coming launch of HyPhoon:

  • A gateway for the access and exchange of datasets, applications and knowledge.
  • A pathway for you to expand your impact and extend your community outreach.
  • A framework for the development and deployment of scientific applications.
  • A resource for obtaining and sharing geospatial datasets.
  • A mechanism for improved technology transfer.
  • A marketplace for scientific computing.

The initial HyPhoon release, coming soon in mid-2013, will focus on providing the community with free and open access to remote sensing datasets. This data will be available for the community to use in research projects, class assignments, algorithm development, application testing and validation, and in some cases also commercial applications. In other words, in the spirit of encouraging innovation, these datasets are offered as a community resource and open to your creativity. We look forward to seeing what you accomplish.

We’ll be announcing the official HyPhoon release here, so stay tuned to be the first to access the data as soon as it becomes available!

Our objective when developing these datasets has been to focus on quality rather than any predefined set of content requirements. Thus, dataset contents are variable. Many of the datasets include a combination of imagery, validation data, and example output. Some datasets include imagery of the same area acquired using different sensors, different resolutions, or different dates. And other datasets simply include unique image examples.

The datasets originate largely from the community itself, with some data also coming from public-domain repositories and commercial image providers. We are also interested in hearing your thoughts on new datasets that would benefit the community. Contact us with your ideas, and if our review team approves the project, we will work with you to add your data to the gateway.

Beyond datasets, HyPhoon will soon also include a marketplace where community members can access advanced algorithms and sell user-created applications. Are you a scientist with an innovative new algorithm? Are you a developer who can help transform research code into user applications? Are you working in the application domain and have ideas for algorithms that would benefit your work? Are you looking to reach a larger audience and expand your impact on the community? If so, we encourage you to get involved in our community.

For more on HySpeed Computing: www.hyspeedcomputing.com