Conservation Technology – Mapping our environment using the Carnegie Airborne Observatory

Remote sensing was recently on stage at TEDGlobal 2013, where Greg Asner highlighted how advanced technology can be leveraged for improved conservation of our natural environment.

Asner, a scientist in the Department of Global Ecology at the Carnegie Institution for Science, believes that “technology is absolutely critical to managing our planet, but even more important is the understanding and wisdom to apply it.”

In his TED talk, Asner illustrates how data acquired from hyperspectral and lidar instruments on the Carnegie Airborne Observatory can be used to generate kaleidoscopic 3D maps of natural ecosystems in unprecedented detail. These maps, which define data layers such as the biodiversity landscape and carbon geography, provide the knowledge needed to make more informed conservation decisions.

Greg Asner: Ecology from the air (13:50)

For more information on the Carnegie Airborne Observatory: http://cao.stanford.edu/

Conservation Remote Sensing – Inviting you to get involved

The Conservation Remote Sensing Working Group (CRSWG) is extending an open invitation to join its growing community and participate in advancing conservation efforts through remote sensing.

Are you passionate about conservation? Are you experienced in remote sensing? Do you have ideas on how remote sensing and geospatial data can be better incorporated into conservation management and planning? Have you developed a new image analysis tool that will benefit the conservation community? If you’ve answered yes to any or all of these questions, then the CRSWG is interested in your input.

The mission of CRSWG is “to increase conservation effectiveness through enhanced integration of remote sensing technologies in research and applications.” Under the leadership of Dr. Robert Rose from the Wildlife Conservation Society, the CRSWG focuses on four key themes that are critical for fostering effective conservation:

Research and Collaboration – “Greater collaboration amongst remote sensing scientists and practitioners will create a critical link between the novel and visionary work of remote sensing scientists and the on-the-ground experience of conservation practitioners.”

Capacity Development – Improved education and awareness “will allow conservationists around the globe to broaden their understanding of applied remote sensing, gain skill sets needed for finding, processing and analyzing remotely sensed data and associated products, and develop an understanding of remote sensing that allows them to integrate remote sensing into conservation.”

Best Practices – “A series of standards and recommendations” are needed “for the best use of remote sensing for conservation applications…, focusing on data collection, generation and integration, validation, models and remote sensing-derived products, as well as application of new technologies such as unmanned aerial vehicles.”

Communications – “The goal is to curate and share critical information, both inward, to the conservation remote sensing community, and outward, communicating to the broader conservation community and others who may be interested in the applications of remote sensing for conservation.”

To learn more, or better yet to join the group, visit Google Groups, search for “Conservation Remote Sensing”, and then select “Apply to join group”.

HySpeed Computing is participating in this community and encourages you to add your voice to the discussion.

Satellite-Derived Bathymetry – New products available from DigitalGlobe

DigitalGlobe recently hosted a webinar describing the new bathymetry and benthic products that are being derived using imagery from their WorldView-2 sensor. The methodology uses a set of algorithms developed ‘in-house’ at DigitalGlobe, and while current capabilities show excellent results, plans are already underway to deliver additional improvements.

DigitalGlobe bathymetry

As a quick refresher, WorldView-2 is a high-resolution satellite with 8 multispectral bands at 1.85m spatial resolution and one panchromatic band at 0.46m resolution. Included in the multispectral bands is a new coastal band (400-450nm) that provides enhanced water penetration and hence improved capabilities for aquatic remote sensing. Note that Landsat 8 has adopted a similar band selection, now including a comparable coastal band (430-450nm), albeit at the coarser spatial resolution of 30m. WorldView-2 was launched on October 8, 2009 from Vandenberg Air Force Base, and interested users have the option of either ordering archived imagery or scheduling new acquisitions for their particular study area.

The WorldView-2 benthic products include options for water depth, water quality and bottom type classification. Because all three products are inherently interrelated in terms of what the satellite sensor measures, DigitalGlobe does not derive them independently; instead, its methodology resolves all three products simultaneously.

For those interested in specifics, the methodology uses HydroLight to generate a table of surface reflectance spectra based on discrete combinations of water depth, water optical properties and bottom reflectance, and then employs a spectral matching technique to find the simulated spectra that best match the spectra for each measured pixel. The output for each pixel is then assumed equivalent to the parameters used for generating the matching spectra in HydroLight.
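
To make the lookup-table idea concrete, here is a minimal sketch of spectral matching against a simulated spectral library. This is not DigitalGlobe’s implementation: the array names are hypothetical, and a simple Euclidean distance stands in for whatever matching metric is actually used.

```python
import numpy as np

# Hypothetical inputs (illustrative only, not DigitalGlobe's actual code):
#   lut_spectra: (N, B) HydroLight-simulated surface reflectance spectra
#   lut_params:  (N, 3) depth, water-property and bottom-type values used
#                to generate each simulated spectrum
#   image:       (rows, cols, B) measured reflectance for each pixel

def match_pixels(image, lut_spectra, lut_params):
    """Assign each pixel the parameters of its closest-matching LUT spectrum."""
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands)
    # Euclidean distance from every pixel to every simulated spectrum
    # (a real implementation would process the image in chunks to save memory)
    dist = np.linalg.norm(pixels[:, None, :] - lut_spectra[None, :, :], axis=2)
    best = np.argmin(dist, axis=1)
    # Output per pixel: the depth, water and bottom values behind the best match
    return lut_params[best].reshape(rows, cols, -1)
```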

The recent webinar focused specifically on output for the water depth product, with example validation results shown for the Bahamas (Lee Stocking Island, Princess Cays and West End), Puerto Rico and St. Croix. In all cases the WorldView-2 output shows strong agreement with acoustic and lidar bathymetry measurements. However, it is noted that accuracy decreases with increasing turbidity, which is to be expected given the inherent physical limitations of this imaging problem (i.e., light needs to be able to penetrate through the water column, reflect off the bottom, and propagate back to the sensor).

DigitalGlobe estimates the following specifications for these products. Water depth is accurate to 10-15% of actual depth in clear water over sand to a maximum depth of 20-30m, and accurate to 15-20% of actual depth in clear water over dark substrates (sea grass, algae, coral) to a maximum depth of 15m. The other products are still being evaluated, but current indications are that water quality output will be valid to 20-30m in clear water and 5m in turbid water, and bottom classification will be valid to 20m over sand and 15m over dark substrates.
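
As a quick sense check of what those percentages mean in absolute terms (the depths below are simply illustrative values within the stated operating range):

```python
def depth_uncertainty(depth_m, pct_low=0.10, pct_high=0.15):
    """Absolute error band implied by an accuracy of 10-15% of actual depth."""
    return depth_m * pct_low, depth_m * pct_high

print(depth_uncertainty(10.0))   # (1.0, 1.5) -> roughly +/- 1-1.5m at 10m depth
print(depth_uncertainty(20.0))   # (2.0, 3.0) -> roughly +/- 2-3m at 20m depth
```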

DigitalGlobe bathymetry

Whereas the current technology for these products relies on the spectral domain, future improvements will focus on incorporating the angular and temporal domains. Specifically, this includes taking advantage of WorldView-2 capabilities to obtain multiple sequential views from different angles. This allows calculations of stereoscopic relationships as well as wave kinematics (i.e., wave motion), which can both be used to derive information on bathymetry and thereby improve the product.
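
The wave-kinematics approach rests on linear wave theory, where the dispersion relation ω² = gk·tanh(kh) ties a wave’s period and wavelength to the local depth h. If sequential views yield an estimate of wavelength and period, depth can be inverted, as in this hedged sketch (the inputs are illustrative, not DigitalGlobe’s algorithm):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def depth_from_wave(wavelength_m, period_s):
    """Invert the linear dispersion relation w^2 = g*k*tanh(k*h) for depth h."""
    k = 2 * math.pi / wavelength_m   # wavenumber
    w = 2 * math.pi / period_s       # angular frequency
    ratio = w ** 2 / (G * k)         # equals tanh(k*h)
    if ratio >= 1.0:                 # deep-water limit: depth no longer constrained
        return None
    return math.atanh(ratio) / k

# Example: a 40m wavelength observed with a 5.5s period implies a depth near 8m
print(depth_from_wave(40.0, 5.5))
```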

If you’re interested in seeing the full webinar: http://digitalglobe.adobeconnect.com/p7g0rab5vr1/

Images supplied courtesy DigitalGlobe.

Highlights from VISualize 2013 – Connecting the remote sensing community

VISualize 2013 is an annual conference hosted by Exelis Visual Information Solutions, and co-sponsored by HySpeed Computing, that brings together thought leaders in the geosciences to discuss the latest trends in remote sensing. The focus this year was on “Connecting the Community to Discuss Global Change and Environmental Monitoring.”

With more than 20 presentations and ample discussion throughout, it was an insightful and very informative conference. Some highlights from VISualize 2013 include:

  • Jim Irons (NASA Goddard Space Flight Center) presented a summary of the Landsat 8 mission, including details on data distribution, sensor specifications, measurement capabilities, and new band designations. He also noted that responsibility for the mission was officially transferred from NASA to USGS on 30 May 2013, signifying completion of all on-orbit checkouts and the start of public data dissemination. More than 20,000 images are already available for users to download.
  • Mark Braza (U.S. Government Accountability Office) described the use of propensity score analysis to estimate the effectiveness of land conservation programs. The approach uses statistical analysis to identify control groups from amongst land areas associated with, but not included in, established conservation projects, and then leverages these control areas to assess the relative impact of land conservation efforts (a simplified sketch of this matching approach follows the list below).
  • Nasser Olwero and Charles Huang (World Wildlife Fund) summarized objectives of the Global Impact Award that WWF recently received from Google. In this project WWF will be using state-of-the-art technology, specifically animal tracking tags, analytical software that optimizes ranger patrolling, and airborne remote sensing, to reduce the impact of animal poaching and protect valuable species like elephants, rhinos and tigers.
  • Matthew Ramspott (Frostburg State University) presented findings from a study using Landsat data to assess wetland change along the Louisiana coast. A key aspect of the analysis was the methodology used to automatically delineate the land/water interface. Results demonstrate the value of using remote sensing to monitor long-term change in coastal wetlands and assess impact from storm damage, flood management decisions and rising sea levels.
  • Robert Rose (Wildlife Conservation Society) outlined the top 10 conservation challenges that can be addressed using remote sensing. The list of challenges results from a NASA-funded workshop in early 2013 and is organized around 10 general themes: species distribution and abundance; species movement and life stages; ecosystem properties and processes; climate change; fast response; protected areas; ecosystem services; conservation effectiveness; land cover change and agricultural expansion; and degradation and disturbance regime. In each theme the objective is to focus on achievable conservation outcomes with clear pathways for putting technology into practice.
  • Robert Rose also spoke about the revitalization of the Conservation Remote Sensing Working Group (CRSWG), which aims to encourage discussion around four main topic areas: research and collaboration; capacity development; communications; and best practices. To join the conversation, just look for CRSWG on Google Groups and contact the group admin to get involved.
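
Returning to the propensity score analysis mentioned above, here is a simplified sketch of the matching idea. The covariates, treatment flags and outcomes are hypothetical placeholders (e.g., terrain and accessibility variables, protected status, change in land cover), not the GAO’s actual data or methodology:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_matched_effect(covariates, treated, outcome):
    """Estimate a treatment effect by matching each treated unit (e.g., a
    protected parcel) to the control unit with the closest propensity score."""
    model = LogisticRegression(max_iter=1000).fit(covariates, treated)
    scores = model.predict_proba(covariates)[:, 1]      # P(treated | covariates)

    treated_idx = np.where(treated == 1)[0]
    control_idx = np.where(treated == 0)[0]

    # 1:1 nearest-neighbor matching on propensity score (with replacement)
    diffs = np.abs(scores[treated_idx, None] - scores[None, control_idx])
    matches = control_idx[np.argmin(diffs, axis=1)]

    # Average outcome difference between treated units and their matched controls
    return float(np.mean(outcome[treated_idx] - outcome[matches]))
```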

Interested in more information on these and other speakers? Exelis VIS will soon be posting copies of all the VISualize 2013 presentations to their website. We’re looking forward to seeing everyone again next year at VISualize 2014.

As thanks to WWF for opening its doors to VISualize, Exelis VIS and HySpeed Computing proudly contributed donations to WWF on behalf of the speakers. Pictured from left to right: Matt Hallas (Exelis VIS), Nasser Olwero (WWF), Charles Huang (WWF) and James Goodman (HySpeed Computing).

VISualize 2013 – Global change and environmental monitoring

HySpeed Computing is honored to be co-sponsoring VISualize 2013, the annual conference hosted by Exelis Visual Information Solutions that brings together thought leaders in the geosciences to discuss the latest trends in remote sensing. The focus of this year’s conference is “Connecting the Community to Discuss Global Change and Environmental Monitoring.” The event is being held at the World Wildlife Fund conference center in Washington, DC from 11-13 June 2013.

HySpeed Computing president Dr. James Goodman will be attending VISualize and presenting a talk on the “Power of Community Data Sharing.” The presentation will explore the status of data sharing in the remote sensing community and discuss how you can benefit and get involved.

The conference’s keynote address will be presented by Dr. Jim Irons, Associate Deputy Director for Atmospheres in the Earth Sciences Division at NASA Goddard Space Flight Center (GSFC) and Project Scientist for the NASA Landsat Data Continuity Mission (LDCM). Dr. Irons will be discussing some of the history of the LDCM mission, and what we should expect now that Landsat 8 is operational and delivering data to the science community. We are looking forward to hearing what he has to say, as well as engaging with all the other speakers and attendees. Hope to see you there.

Here are some other upcoming remote sensing conferences later this year that may also be of interest:

WHISPERS: Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, 25-28 June 2013, Gainesville, FL, USA, http://www.ieee-whispers.com/

ESRI: International User Conference, 8-12 July 2013, San Diego, CA, USA, http://www.esri.com/events/user-conference

IGARSS: International Geoscience and Remote Sensing Symposium, 21-26 July 2013, Melbourne, Australia, http://www.igarss2013.org/

SPIE: Remote Sensing, 23-26 September 2013, Dresden, Germany, http://spie.org/x6262.xml

Coral Reef Remote Sensing – A new book for coral reef science and management

Announcing publication of “Coral Reef Remote Sensing: A Guide for Mapping, Monitoring and Management”, edited by Dr. James Goodman, president of HySpeed Computing, and his colleagues Dr. Sam Purkis from Nova Southeastern University and Dr. Stuart Phinn from University of Queensland.

This groundbreaking new book explains and demonstrates how satellite, airborne, ship-based and ground-based imaging technologies, collectively referred to as “remote sensing”, are essential for understanding and managing coral reef environments around the world.

The book includes contributions from an international group of leading coral reef scientists and managers, demonstrating how remote sensing resources are now unparalleled in the types of information they produce, the level of detail provided, the area covered and the length of time over which they have been collected. When used in combination with field data and knowledge of coral reef ecology and oceanography, remote sensing contributes an essential source of information for understanding, assessing and managing coral reefs around the world.

The authors have produced a book that comprehensively explains how each remote sensing data collection technology works, and more importantly how they are each used for coral reef management activities around the world.

In the words of Dr. Sylvia Earle – renowned scientist and celebrated ocean explorer:

This remarkable book, Coral Reef Remote Sensing: A Guide for Mapping, Monitoring and Management, for the first time documents the full range of remote sensing systems, methodologies and measurement capabilities essential to understanding more fully the status and changes over time of coral reefs globally. Such information is essential and provides the foundation for policy development and for implementing management strategies to protect these critically endangered ecosystems.

I wholeheartedly recommend this book to scientists, students, managers, remote sensing specialists, and anyone who would like to be inspired by the ingenious new ways that have been developed and are being applied to solve one of the world’s greatest challenges: how to take care of the ocean that takes care of us.

If it had been available in 1834, Charles Darwin would surely have had a copy on his shelf.

We invite you to explore the book (including a sneak peek inside the chapters) and see how you can put the information to use on your own coral reef projects.

Rockets in the Rainforest – ESA deploys three new satellites from its spaceport in French Guiana

Proba-V (artist rendition; courtesy ESA)

Earlier this week, on a rainy night at the European Spaceport located along the tropical coast of French Guiana, the Vega launch vehicle successfully rocketed into space and completed its mission of deploying three new satellites into orbit.

This was just the second deployment of Vega, representing a momentous occasion for both the European Space Agency and Arianespace – the company operating the launch – and marking another significant step forward in the commercial transition of launch operations.

Also celebrating the Vega launch were the teams behind the three satellites deployed during the mission. These include:

  • Proba-V. From the European Space Agency, Proba-V was the primary payload of the Vega mission. The “V” stands for vegetation, and the satellite is designed as a follow-on mission to the vegetation imagers included on the French SPOT-4 and SPOT-5 satellites. Proba-V carries a moderate-resolution four-band multispectral instrument capable of mapping complete global vegetation cover once every two days.
  • VNREDSat-1A. Representing the first Earth observing satellite from Vietnam, this is a high-resolution five-band imager (four multispectral bands and one panchromatic) designed for monitoring and managing natural resources, assessing the impacts of climate change, and improving response to natural disasters.
  • ESTCube-1. This represents the very first satellite from Estonia. ESTCube-1 is a CubeSat built primarily by students at the University of Tartu. Its main scientific objective is to deploy and test an electric solar wind sail, a novel method of space propulsion.

You may ask why the European Spaceport, aka the Guiana Space Center, is located in the equatorial rainforest of South America, which at first may seem like an unlikely location. The answer is that the Spaceport’s location has some significant advantages. First and foremost, its proximity to the equator makes the Spaceport ideal for launching satellites into geosynchronous orbit, and the higher rotational speed of the Earth’s surface near the equator also lends efficiency to the launch process (i.e., saving fuel and money), as sketched below. Second, the region is relatively unpopulated and not at risk from earthquakes or hurricanes, thereby significantly reducing the risk from any unforeseen disasters. The European Spaceport also has a rich launch history extending back nearly 50 years: originally established by France in 1964, it has been used by the European Space Agency since the agency’s founding in 1975.
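
For a back-of-the-envelope sense of that equatorial advantage, the eastward velocity a launch site inherits from Earth’s rotation is ωR·cos(latitude). The site latitudes below are approximate:

```python
import math

EARTH_RADIUS_M = 6_378_137   # equatorial radius
SIDEREAL_DAY_S = 86_164      # time for one full rotation of the Earth

def eastward_surface_speed(latitude_deg):
    """Free eastward velocity (m/s) a launch site gains from Earth's rotation."""
    omega = 2 * math.pi / SIDEREAL_DAY_S
    return omega * EARTH_RADIUS_M * math.cos(math.radians(latitude_deg))

print(eastward_surface_speed(5.2))    # Kourou, French Guiana: ~463 m/s
print(eastward_surface_speed(28.5))   # Cape Canaveral, for comparison: ~409 m/s
```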

With all this talk lately about new satellites, it may also seem like space is starting to get crowded. It is! The issue isn’t necessarily all the new satellites being launched, but rather all the derelict space debris that remains in orbit. To address this issue, there has been significant international discussion lately to develop debris removal plans. While such an endeavor is certainly going to be costly and logistically difficult, space cleanup is a necessary step towards ensuring the integrity of current and future satellites.

But for now let’s celebrate the success of these latest satellite missions and make sure the data is put to good use.

Accessing The Oceans – See how Marinexplore is connecting users with a world of data

Are you working on an oceanographic or marine related project where you need to identify and access the many data resources available for your study area? Marinexplore is now making this process easier than ever.

As with many fields of research, the realm of ocean science includes a staggering volume of data that has already been collected, and continues to be collected, by different organizations and government entities around the world. While there is a general movement throughout science towards improved data availability, greater standardization of data formats, and increased adoption of data interoperability standards, efficiently searching and accessing all of this data can still be a cumbersome task.

Marinexplore

To address this challenge, Marinexplore has created a centralized resource for the ocean science community to quickly access multiple data sources from a single framework. Using an interface built on top of Google Maps, users can easily search from amongst the many available data collections, select relevant data for a particular project, and download the resulting dataset in a single file. Not only does this cut down on search time, it also simplifies a number of data integration and preprocessing steps. Users can also save, store and share created datasets, as well as collaborate with other users.

So how does it work? Marinexplore currently has access to more than 1.2 billion in situ measurements. This predominantly includes publicly available data acquired from ocean instruments, such as buoys, drifters, fixed platforms, and ships, as well as products generated from satellite sensors. Data access is a free service, but users must first register with Marinexplore to set up an account. Users can then create up to three datasets per day, with each dataset initially limited to 5 million measurements and options to expand this limit to 25 million measurements per dataset (and beyond) by referring other users.

Marinexplore reportedly also has plans to expand functionality of its system, such as providing an API (Application Programming Interface) for developing specialized applications, functionality for data streaming, and the ability to run oceanographic models. But these features have yet to be added. For now Marinexplore is focused on establishing a user community and delivering data to interested users.

So go check out the data, and see what’s available for you to use on your next project.

For more information on Marinexplore: http://marinexplore.com/

LDCM Prepares for Launch – Continuing the Landsat legacy

NASA hosted a press conference last Thursday to highlight final preparations as LDCM – the Landsat Data Continuity Mission – prepares for launch on Feb 11, 2013.

With the countdown to this momentous launch drawing near – now less than a month away – the excitement amongst the panel members at the press conference was clearly evident. Many eyes from around the world will be expectantly watching Vandenberg Air Force Base in California next month as LDCM lifts off aboard an Atlas 5 rocket.

LDCM, which will officially be renamed Landsat 8 once in orbit, is important as the next satellite in the long-lived, and incredibly successful, Landsat program. With its new capabilities and improved instrument design, LDCM was described by panelists at the recent briefing as the “best Landsat ever”, delivering both “more data per day” and “higher quality data” than any previous Landsat.

Successful launch of LDCM becomes particularly crucial in light of recent announcements that Landsat 5 will soon be decommissioned and considering ongoing concerns related to the operational limitations of Landsat 7 [note that Landsat 6 failed to reach orbit during its 1993 launch, and thus never made it into operation]. While numerous other satellites provide their own capabilities for Earth observation, the unprecedented 40-year continuity of the Landsat program enables analysis of long-term trends and unique capabilities for the assessment and monitoring of our changing planet. LDCM thus represents more than just a new satellite, but a critically important continuation of numerous global science applications.

LDCM contains two science instruments, the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The OLI instrument will measure a total of nine spectral bands: eight multispectral bands at 30m resolution in the visible, near-infrared, and shortwave infrared; and one panchromatic band at 15m resolution in the visible. Unlike previous Landsat missions, which used whiskbroom instruments, the OLI utilizes a pushbroom configuration, thereby enabling improved signal-to-noise performance, i.e., improved data quality. The TIRS instrument will measure two thermal bands at 100m resolution, subdividing the single thermal band previously measured by Landsat 4-7. With this overall design configuration, the OLI and TIRS instruments together maintain the legacy of previous Landsat instruments, while at the same time expanding to include additional capabilities.
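
As one small illustration of how the band designations are used in practice, OLI’s red and near-infrared channels (bands 4 and 5) feed the familiar NDVI calculation. A minimal sketch, assuming the input arrays are already-calibrated reflectance:

```python
import numpy as np

def ndvi(red, nir):
    """NDVI from OLI band 4 (red) and band 5 (near-infrared) reflectance arrays."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-10)   # guard against divide-by-zero
```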

LDCM is a joint program between NASA and USGS, with NASA handling instrument development, spacecraft design and launch, and USGS handling flight operations, and data processing, distribution and archiving. Importantly, as has been the policy since 2008, data from Landsat 8 will be offered free to all interested users.

The launch will be streamed live on NASA TV (http://www.nasa.gov/multimedia/nasatv/). Don’t miss this historic occasion.

For more information on LDCM: http://ldcm.nasa.gov/

The NEON Science Mission – Open access ecological data

Interested in assessing the ecological impacts of climate change? How about investigating the complex dynamics of ecological response to land use change and invasive species? What types of data would you need to perform such research at regional and continental scales? These are just some of the ambitious science questions being addressed by NEON – the National Ecological Observatory Network.

Sponsored by the U.S. National Science Foundation, NEON is an integrated network of 60 sites located throughout the U.S. where infrastructure is being put in place to collect a uniform array of scientific data. The hypothesis is that by providing consistent measurements and observations across the U.S., scientists will be better able to answer critical questions related to environmental change. Originally conceived in 1997, and followed by many years of planning, NEON entered its construction phase in 2012. Current plans are for the network to be fully operational in 2017, and for data from NEON to be collected for 30 years.

The 60 NEON sites encompass the continental U.S., Alaska, Hawaii and Puerto Rico. Sites were selected to represent a diverse range of vegetation communities, climate zones, land types, and land-use categories. The current list of NEON data products to be collected at each site includes over 500 different entries, spanning both field and remote sensing observations. Items range from as detailed as genetic sequences and isotope analyses of field samples to as broad as temperature and wind speed measurements from meteorological instruments. Additionally, in what has become a welcome trend within the community, NEON data is being distributed under an open access policy.

Of particular interest to the remote sensing community is that NEON includes an Airborne Observation Platform (AOP) that will be used to collect digital photography, imaging spectroscopy data, and full-waveform LiDAR data. To accommodate the geographic distribution of NEON sites, this same suite of remote sensing instrumentation will be deployed on three different aircraft. Note that remote sensing data collection, as well as testing and validation of analysis protocols, has already begun and preliminary data is available upon request.

Given its scope, it is clear that the data and information derived from the NEON project will have a profound impact on our understanding of the natural environment and our ability to assess ecological change.

For more information on NEON: http://www.neoninc.org/