The National Strategy for Earth Observation – Data management and societal benefits

White House Office of Science and Technology Policy (OSTP)

Earlier this month, the U.S. National Science and Technology Council released its report on the National Strategy for Civil Earth Observations. This is the first step towards building a national roadmap for the more efficient utilization and management of U.S. Earth observing resources.

Current U.S. capabilities in Earth observation, as summarized in the report, are distributed across more than 100 different programs, including those at both Federal agencies and various non-Federal organizations (e.g., state and local governments, academic institutions, and commercial companies). This extends far beyond just the well-known satellite programs operated by NASA and NOAA, encompassing a variety of other satellite and airborne missions being conducted around the country, as well as a host of other land- and water-based observing systems. From a national perspective this represents not just a complex array of programs and organizations to manage, but also an increasingly voluminous collection of data products and information to store and make available for use.

With the objective of improving the overall management and utilization of the various Earth observing resources, the National Strategy outlines two primary organizational elements. The first element addresses a “policy framework” for prioritizing investments in observing systems that support specified “societal benefit areas,” and the second element speaks to the need for improved methods and policies for data management and information dissemination.

The National Strategy also lays the foundation for ultimately developing a National Plan for Civil Earth Observations, with initial publication targeted for fiscal year 2014 and subsequent versions published every three years thereafter. As indicated by its title, the National Plan will provide the practical details and fundamental information needed to implement the various Earth observing objectives. Additionally, by periodically revisiting and reassessing technological capabilities and societal needs, the “approach of routine assessment, improved data management, and coordinated planning is designed to enable stable, continuous, and coordinated Earth-observation capabilities for the benefit of society.”

The overall motivation behind the National Strategy and National Plan is the recognized societal importance of Earth observation. Specifically, “Earth observations provide the indispensable foundation for meeting the Federal Government’s long-term sustainability objectives and advancing U.S. social, environmental, and economic well-being.” With that in mind, the National Strategy specifies twelve key “societal benefit areas” – agriculture and forestry, biodiversity, climate, disasters, ecosystems, energy and mineral resources, human health, ocean and coastal resources and ecosystems, space weather, transportation, water resources, and weather – along with the crosscutting category of reference measurements. Also deemed relevant are the various technology developments that span all focus areas, such as advances in sensor systems, data processing, algorithm development, data discovery tools, and information portals.

The National Strategy additionally presents a comprehensive outline for a unified data management framework, which sets the fundamental “expectations and requirements for Federal agencies involved in the collection, processing, stewardship, and dissemination of Earth-observation data.” The framework addresses needs across the entire data life cycle, beginning with the planning stages of data collection, progressing through data organization and formatting standards, and extending to data accessibility and long-term data stewardship. Also included is the need to provide full and open data access to all interested users and to optimize interoperability, thereby facilitating the more efficient exchange of data and information products across the entire community.

With this National Strategy, the U.S. is defining a unified vision for integrating existing resources and directing future investments in Earth observation. We are looking forward to reading the upcoming National Plan, which is targeted for release later this year.

To access a copy of the National Strategy report, visit the Office of Science and Technology Policy: http://www.whitehouse.gov/administration/eop/ostp


Space Apps and You – Check out the official list of this year’s challenges

In preparation for next month’s 48-hour global codeathon taking place April 20-21, the Space Apps team has now released their official list of challenges, with more than 50 opportunities in which you can participate: http://spaceappschallenge.org/challenges/

If you’re not already familiar with the Space Apps Challenge, it’s an amazing opportunity to work together with collaborators around the world to “solve current challenges relevant to both space exploration and social need.” For more information, please refer to our previous Space Apps post and visit the official website: http://spaceappschallenge.org

General categories for this year’s codeathon include software, hardware, citizen science, and data visualization. And they’re not all just about space… check out the challenges for “Backyard Poultry Farmer”, “Lego Rovers”, “OpenROV”, “Off the Grid”, “In the Sky with Diamonds”, and “Renewable Energy Explorer.” There’s something for everyone.

While we think all of the proposed challenges are exciting, we here at HySpeed Computing have a particular interest in geospatial technologies and would therefore like to highlight some specific challenges speaking directly to the areas of remote sensing and Earth observation:

  • Earth Day Challenge – “How can space data help us here on Earth? April 22 is Earth Day. Create a visualization of how pollution levels have changed over time.  Many pollution problems have been vastly improved, such as water pollution in the Great Lakes, and air pollution in Los Angeles. But others have significantly worsened, like CO2 emissions and ozone depletion.”
  • The Blue Marble – “Create an app, platform or website that consolidates a collection of space imagery and makes it more accessible to more people.”
  • EarthTiles – “Take global imagery data from Landsat, EOS, Terra, and other missions and turn them into imagery tiles that can be used in an open source street map. This would enable incredible amounts of visualization and contextual data narration to occur, especially if such tiles were able to be updated on a regular basis as new data is released.” (For one possible starting point, see the sketch after this list.)
  • Seeing Water From Space – “Create a web map of Chile water resources, showing how they have changed over time and how their changes over time relate to changes in climate.”
  • Earth From Space – “Using images taken by middle school students through the ISS EarthKAM program, create an educational application that allows users to overlay EarthKAM images on a 3D model of earth, annotate and comment on the images, and share their work via social media. This application can be web based or designed as a mobile application for an Android device.”
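
For something like the EarthTiles challenge, the first step maps onto a fairly standard geospatial workflow: reprojecting imagery into the Web Mercator projection used by most open web map tiles. Below is a minimal sketch of that step using the GDAL Python bindings; the file names are hypothetical placeholders, and gdal2tiles.py is just one of several tools that could handle the actual tiling.

```python
# Minimal sketch for an EarthTiles-style workflow: reproject a Landsat
# GeoTIFF to Web Mercator (EPSG:3857), the projection used by standard
# web map tiles. File names are hypothetical placeholders.
from osgeo import gdal

gdal.UseExceptions()

gdal.Warp(
    "landsat_webmercator.tif",  # output (hypothetical name)
    "landsat_scene.tif",        # input Landsat scene (hypothetical name)
    dstSRS="EPSG:3857",
    resampleAlg="bilinear",
)

# From here, a utility such as gdal2tiles.py can slice the reprojected
# image into XYZ tiles for use in web mapping libraries:
#   gdal2tiles.py -z 5-12 landsat_webmercator.tif tiles/
```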

We’re excited to see what you can accomplish using Earth imagery. So look for an in-person event near you, or participate virtually from your own location. Good luck and happy coding!

Get Your Code On – The NASA International Space Apps Challenge is coming

Get ready to flex your fingers and exercise your brain. Next month, from April 20-21, NASA is hosting the International Space Apps Challenge – a 48-hour global hackathon. Everyone and anyone is welcome to attend.

“The International Space Apps Challenge is a 2 day technology development event during which citizens from around the world will work together to solve current challenges relevant to both space exploration and social need.”

There are currently over 75 cities around the world hosting in-person events, including one extraordinary location orbiting the Earth onboard the International Space Station. These in-person events, which are independently organized by local volunteers, provide the opportunity to interact and collaborate with fellow participants. However, if there’s not a venue near you, or you think best when you’re in your own environment, you can also contribute to the event virtually from your own location, perhaps even gathering a group of friends to create your own mini-event. To participate, either in-person or virtually, simply visit the Space Apps website – spaceappschallenge.org – and register.

You don’t have to be a ‘space’ professional to contribute, nor do you need to be an expert programmer. The objective of the event is to bring together a diverse group of people, with a varied range of skills and backgrounds, who have “a passion for changing the world and are willing to contribute.” Last year’s event, which numbered more than 2000 participants, received contributions from a wide array of scientists, engineers, artists, writers, entrepreneurs, and many more. All that’s required is a spirit of innovation.

Participants in the App Challenge are encouraged to work as teams, but can also work alone, to “solve challenges relevant to improving life on Earth and life in space.” Top solutions from each location will be entered in the global competition, where winners will be awarded prizes and recognition for their achievements. A list of suggested challenges will be posted on the event website in the near future. In the meantime, current suggestions for this year’s event include:

  • “Help tell the ‘why’ of space exploration through the creation of compelling narratives and visualizations of the stories and data from NASA’s history.”
  • “Design a CubeSat (or constellation of CubeSats) that can utilize extra space onboard future robotic Mars missions to help us understand more about the Red Planet.”
  • “Help revitalize antiquated data by creating open source tools to transform, display, and visualize data.”

But this is just a small sample of the challenges yet to come. Perhaps you also have your own ideas and would like to develop your own unique contribution. This too is welcomed. You can even get started in advance (though the bulk of the work should be completed during the weekend of the event), so that you can hit the ground running.

So grab your favorite laptop, tablet, or other device and get comfortable. It’s time to code. Good luck everyone!

For more on the NASA Space Apps Challenge: http://spaceappschallenge.org  

Data Management and You – A broader look at research data requirements

This is Part 2 of a discussion series on data management requirements for government-funded research.

As discussed in the previous installment of this series, data management has become an integral requirement of government-funded research projects. Not only should the data be made available to the public because the research was supported with taxpayer funding, but data sharing also helps expand the impact and influence of your own research.

Part 1 of this series focused on the data management requirements of the National Science Foundation (NSF). In Part 2 below we look at the National Aeronautics and Space Administration (NASA), the Australian Research Council (ARC), and Research Councils UK (RCUK).

As with the NSF proposal process, NASA requires a data-sharing plan to be incorporated as part of any proposal response. Specifically, as described in the NASA Guidebook for Proposers, the “Proposer shall provide a data-sharing plan and shall provide evidence (if any) of any past data-sharing practices.” Unlike NSF, which requires a separate two-page plan, the NASA data-sharing plan must be incorporated within the main body of the proposal as part of the Scientific-Technical-Management section. Additionally, and important to keep in mind, NASA also specifies that “all data taken through research programs sponsored by NASA are considered public”, that “NASA no longer recognizes a ‘proprietary’ period for exclusive use of any new scientific data”, and that “all data collected through any of its funded programs are to be placed in the public domain at the earliest possible time following their validation and calibration.” This means no more holding data in reserve until a researcher has completed their work and published their results. Instead, NASA is taking a strong stand on making its data publicly available as soon as possible.

Looking now to the United Kingdom, RCUK explicitly defines data sharing as a core aspect of its overall mission and responsibility as a government organization. As part of its Common Principles on Data Policy, RCUK states that “publicly funded research data are a public good, produced in the public interest, which should be made openly available with as few restrictions as possible in a timely and responsible manner.” To achieve this objective, the individual Research Councils that comprise RCUK each incorporate their own specific research requirements that conform to this policy. For example, the Natural Environment Research Council (NERC) specifies in its Grants and Fellowships Handbook that each proposal must include a one-page Outline Data Management Plan. If funded, researchers then work with the NERC Environmental Data Centres to devise a final Data Management Plan. And at the conclusion of the project, researchers coordinate with the Data Centres to transfer their data and make it available for others to use.

The Australian Research Council also encourages data sharing as an important component of funded research projects. While the ARC does not require data management plans in its proposals, the policies listed in the ARC Funding Rules explicitly encourage “depositing data and any publications arising from a research project in an appropriate subject and/or institutional repository.” Additionally, as part of the final reporting requirements for most ARC awards, the researcher must specify “how data arising from the project have been made publicly accessible where appropriate.” It is also common for ARC funding opportunities to require that the Project Description include a discussion of strategies for communicating research outcomes. While not explicitly stated, data sharing can certainly play an important role in meeting such needs to disseminate and promote research achievements.

Government agencies clearly recognize the importance of data, and are making it a priority in their research and proposal requirements. So don’t forget to include data management as part of your next proposal planning process.

Data Management and You – A look at NSF requirements for data organization and sharing

This is Part 1 of a discussion series on data management requirements for government-funded research.

Data is powerful. From data comes information, and from information comes knowledge. Data is also a critical component in quantitative analysis and for proving or disproving scientific hypotheses. But what happens to data after it has served its initial purpose? And what are your obligations, and potential benefits, with respect to openly sharing data with other researchers?

Data management and data sharing are viewed with growing importance in today’s research environment, particularly in the eyes of government funding agencies. Not only is data management a requirement for most proposals using public funding, but effective data sharing can also work in your favor in the proposal review process. Consider the difference between two accomplished scientists, both conducting excellent research and publishing results in top journals, but only one of whom has made their data openly available, with thousands of other researchers already accessing the data for further research. Clearly, the scientist who has shared data has created substantial additional impact on the community and facilitated a greater return on investment beyond the initially funded research. Such accomplishments can and should be included in your proposals.

As one example, let’s examine the data management requirements for proposals submitted to the U.S. National Science Foundation. What is immediately obvious when preparing an NSF proposal is the need to incorporate a two-page Data Management Plan as an addendum to your project description. Requirements for the Data Management Plan are outlined in the “Proposal and Award Policies and Procedures Guide” (2013) within both the “Grant Proposal Guide” and the “Award & Administration Guide.” Note that in some cases there are also specific data management requirements for particular NSF Directorates and Divisions, which must be adhered to when submitting proposals for those programs.

To quote from the NSF policy: “Investigators are expected to share with other researchers, at no more than incremental cost and within a reasonable time, the primary data, samples, physical collections and other supporting materials created or gathered in the course of work under NSF grants. Grantees are expected to encourage and facilitate such sharing.” Accordingly, the proposal will need to describe the “types of data… to be produced in the course of the project”, “the standards to be used for data and metadata format”, “policies for access and sharing”, “policies and provisions for re-use, re-distribution, and the production of derivatives”, and “plans for archiving data… and for preservation of access.” Proposals cannot be submitted without such a plan.

As another important consideration, if “any PI or co-PI identified on the project has received NSF funding (including any current funding) in the past five years”, the proposal must include a description of past awards, including a synopsis of data produced from these awards. Specifically, in addition to a basic summary of past projects, this description should include “evidence of research products and their availability, including, but not limited to: data, publications, samples, physical collections, software, and models, as described in any Data Management Plan.”

Along these same lines, NSF also recently adjusted the requirements for the Biographical Sketch to specify “Products” rather than just “Publications.” Thus, in addition to previous items in this category, such as publications and patents, “Products” now also includes data.

The overall implication is that NSF is interested in seeing both past success in impacting the community through data sharing and specific plans for how this will be accomplished in future research. Be sure to keep this in mind when writing your next proposal. And remember… data is powerful.

For more information on NSF proposal guidelines: http://www.nsf.gov/bfa/dias/policy/

The International Space Station – A unique platform for Earth observation

International Space Station (image: NASA).

From the launch of its first module in 1998, to its first onboard crew in 2000, to today’s expansive labyrinth of space laboratories and solar arrays, the International Space Station is a technological marvel and an icon of human innovation. The ISS is well known as a research facility for medicine, biology, physical science, physiology, space flight, and cutting-edge engineering.

But did you know the ISS is also home to a unique collection of facilities that can be used for Earth observing instruments? These include the Columbus – External Payload Facility (Columbus-EPF), the Expedite the Processing of Experiments to the Space Station Logistics Carrier (ELC), the Japanese Experiment Module – Exposed Facility (JEM-EF) and the Window Observational Research Facility (WORF). The Columbus-EPF, ELC and JEM-EF support external payloads, which means remote sensing instruments can be mounted on the outside of the ISS. And the WORF supports internal payloads by providing a high-optical-quality window through which instruments can view the Earth below.

Advantages of using the ISS as a remote sensing platform include the capacity to install new instruments with relative ease (at least compared with launching free-flying satellites), the ability to remove instruments and transport them to the ground for post-mission analysis, and in some cases the option for crew interaction with the instrument while onboard the station. The ISS also has a unique orbit that differs from the sun-synchronous orbits of most Earth observing satellites, thus allowing image collection at different times of day and under different illumination conditions than would otherwise be possible. These same orbit characteristics, however, can also be a disadvantage with respect to image uniformity and operational requirements, and in some situations the station’s solar arrays can interfere with observations. Nonetheless, the ISS is an excellent facility to test new instruments and explore new remote sensing capabilities.

So what are some of the instruments that have flown on the ISS? There are HICO (Hyperspectral Imager for the Coastal Ocean) and RAIDS (Remote Atmospheric and Ionospheric Detection System), which are integrated into a single payload installed on the JEM-EF. As the name implies, HICO is a hyperspectral instrument that has been optimized for imaging the nearshore aquatic environment. RAIDS is used for measuring the major constituents of Earth’s upper atmosphere, specifically the thermosphere and the ionosphere. There is also the EVC (Earth Viewing Camera), which is part of the European Technology Exposure Facility (EuTEF) deployed on Columbus-EPF. EVC is a commercial off-the-shelf digital camera used to capture color images of the Earth’s surface. A final example is ISSAC (International Space Station Agricultural Camera), which is installed as an internal payload on WORF. ISSAC is a multispectral camera measuring wavelengths in the visible and near infrared that primarily targets agricultural areas in the northern Great Plains. ISSAC is also particularly exciting, since it was largely built and operated by students at the University of North Dakota.

Those are just a few of the exciting instruments on the ISS. There are many others… and more instruments planned for future missions.

Are you involved in instrument development or image analysis related to the ISS? We’d love to hear your thoughts and stories and share them with the community.

For more about the ISS: http://www.nasa.gov/mission_pages/station/main/

HySpeed Computing – Reviewing our progress and looking ahead

Join HySpeed Computing as we highlight our accomplishments from the past year and look ahead to what is sure to be a productive 2013.

The past year has been an eventful period in the life of HySpeed Computing. This was the year we introduced ourselves to the world, launching our website (www.hyspeedcomputing.com) and engaging the community through social media platforms (the usual suspects: Facebook, LinkedIn and Google+). If you’re reading this, you’ve found our blog, and we thank you for your interest. We’ve covered a variety of topics to date, from community data sharing and building an innovation community to Earth remote sensing and high performance computing. As our journey continues we will keep sharing our insights, and we welcome you to participate in the conversation.

August of 2012 marked the completion of work on our grant from the National Science Foundation (NSF). The project, funded through the NSF SBIR/STTR and ERC Collaboration Opportunity, was a partnership between HySpeed Computing and the Bernard M. Gordon Center for Subsurface Sensing and Imaging Systems at Northeastern University. Through this work we successfully utilized GPU computing to accelerate a remote sensing tool for the analysis of submerged marine environments. Our accelerated version was 45x faster than the original, approaching the capacity for real-time processing of this complex algorithm.
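
To give a flavor of what this kind of GPU acceleration looks like (a minimal sketch only, not the actual project code, which targeted a far more complex algorithm), here is a hypothetical per-pixel computation moved to the GPU using the CuPy library:

```python
# Minimal sketch of GPU-accelerating a per-pixel remote sensing
# calculation with CuPy (a NumPy-compatible GPU array library).
# Illustrative only; the data cube and operation are hypothetical.
import numpy as np
import cupy as cp

# Hypothetical hyperspectral cube: (rows, cols, bands)
cube = np.random.rand(1024, 1024, 64).astype(np.float32)

# Move the data to GPU memory
gpu_cube = cp.asarray(cube)

# A simple per-pixel operation: normalize each spectrum by its total
# reflectance. On the GPU this runs across all pixels in parallel.
totals = gpu_cube.sum(axis=2, keepdims=True)
normalized = gpu_cube / totals

# Copy the result back to host memory for further analysis
result = cp.asnumpy(normalized)
```

Real speedups, like the 45x figure above, come from exactly this pattern applied at scale: thousands of independent per-pixel calculations executing concurrently instead of one at a time.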

HySpeed Computing president, Dr. James Goodman, also attended a number of professional conferences and meetings during 2012. This included showcasing our achievements in geospatial applications and community data sharing at the International Coral Reef Symposium in Cairns, Australia, and the NASA HyspIRI Science Workshop in Washington, D.C., as well as presenting our accomplishments in remote sensing algorithm acceleration at the GPU Technology Conference in Pasadena, CA, and the VISualize Conference in Washington, D.C. Along the way we met, and learned from, a wonderfully diverse group of scientists and professionals. We are encouraged by the direction and dedication we see in the community and honored to be a contributor to this progress.

So what are we looking forward to in 2013? You heard it here first – we are proud to announce that we will soon be launching HyPhoon, a gateway for accessing and sharing both datasets and applications. The initial HyPhoon release will focus on providing the community with free and open access to remote sensing datasets. We already have data from the University of Queensland, Rochester Institute of Technology, University of Puerto Rico at Mayaguez, NASA, and the Galileo Group, with additional commitments from others. This data will be available for the community to use in research projects, class assignments, algorithm development, application testing and validation, and in some cases also commercial applications. In other words, in the spirit of encouraging innovation, these datasets are offered as a community resource and open to your creativity. We look forward to seeing what you accomplish.

Connect with us through our website or via social media to pre-register and be among the first to access the data as soon as it becomes available!

Beyond datasets, HyPhoon will also soon include a marketplace for community members to access advanced algorithms and sell user-created applications. Are you a scientist with an innovative new algorithm? Are you a developer who can help transform research code into user applications? Are you working in the application domain and have ideas for algorithms that would benefit your work? Are you looking to reach a larger audience and expand your impact on the community? If so, we encourage you to get involved in our community.

HySpeed Computing is all about accelerating innovation and technology transfer.

LDCM Prepares for Launch – Continuing the Landsat legacy

NASA hosted a press conference last Thursday to highlight final preparations as LDCM – the Landsat Data Continuity Mission – prepares for launch on Feb 11, 2013.

With the countdown to this momentous launch drawing near – now less than a month away – the excitement amongst the panel members at the press conference was clearly evident. Many eyes from around the world will be expectantly watching Vandenberg Air Force Base in California next month as LDCM lifts off aboard an Atlas 5 rocket.

LDCM, which will officially be renamed Landsat 8 once in orbit, is important as the next satellite in the long-lived, and incredibly successful, Landsat program. With its new capabilities and improved instrument design, LDCM was described by panelists at the recent briefing as the “best Landsat ever”, delivering both “more data per day” and “higher quality data” than any previous Landsat.

The successful launch of LDCM is particularly crucial in light of recent announcements that Landsat 5 will soon be decommissioned, and considering ongoing concerns related to the operational limitations of Landsat 7 [note that Landsat 6 failed to reach orbit during its 1993 launch, and thus never made it into operation]. While numerous other satellites provide their own capabilities for Earth observation, the unprecedented 40-year continuity of the Landsat program enables analysis of long-term trends and unique capabilities for the assessment and monitoring of our changing planet. LDCM thus represents more than just a new satellite; it is a critically important continuation of numerous global science applications.

LDCM contains two science instruments, the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The OLI instrument will measure a total of nine spectral bands: eight multispectral bands at 30m resolution in the visible, near-infrared, and shortwave infrared; and one panchromatic band at 15m resolution in the visible. Unlike previous Landsat missions, which used whiskbroom instruments, the OLI utilizes a pushbroom configuration, thereby enabling improved signal-to-noise performance, i.e., improved data quality. The TIRS instrument will measure two thermal bands at 100m resolution, subdividing the single thermal band previously measured by Landsat 4-7. With this overall design configuration, the OLI and TIRS instruments together maintain the legacy of previous Landsat instruments, while at the same time expanding to include additional capabilities.
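
As a simple illustration of how these bands get used in practice, here is a minimal sketch computing NDVI (the normalized difference vegetation index) from the OLI red and near-infrared bands – bands 4 and 5 on Landsat 8. It uses the GDAL Python bindings, and the file names are hypothetical placeholders for the single-band GeoTIFFs a user might download.

```python
# Minimal sketch: NDVI from Landsat 8 OLI band 4 (red) and band 5 (NIR).
# Assumes the bands are available as separate GeoTIFFs; the file names
# below are hypothetical placeholders.
import numpy as np
from osgeo import gdal

gdal.UseExceptions()

red = gdal.Open("LC8_B4_red.tif").ReadAsArray().astype(np.float32)
nir = gdal.Open("LC8_B5_nir.tif").ReadAsArray().astype(np.float32)

# NDVI = (NIR - red) / (NIR + red), guarding against divide-by-zero
ndvi = np.zeros_like(red)
valid = (nir + red) != 0
ndvi[valid] = (nir[valid] - red[valid]) / (nir[valid] + red[valid])

print("NDVI range: %.2f to %.2f" % (ndvi.min(), ndvi.max()))
```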

LDCM is a joint program between NASA and USGS, with NASA handling instrument development, spacecraft design and launch, and USGS handling flight operations as well as data processing, distribution and archiving. Importantly, as has been the policy since 2008, data from Landsat 8 will be offered free to all interested users.

The launch will be streamed live on NASA TV (http://www.nasa.gov/multimedia/nasatv/). Don’t miss this historic occasion.

For more information on LDCM: http://ldcm.nasa.gov/

The NEON Science Mission – Open access ecological data

Interested in assessing the ecological impacts of climate change? How about investigating the complex dynamics of ecological response to land use change and invasive species? What types of data would you need to perform such research at regional and continental scales? These are just some of the ambitious science questions being addressed by NEON – the National Ecological Observatory Network.

Sponsored by the U.S. National Science Foundation, NEON is an integrated network of 60 sites located throughout the U.S. where infrastructure is being put in place to collect a uniform array of scientific data. The hypothesis is that by providing consistent measurements and observations across the U.S., scientists will be better able to answer critical questions related to environmental change. Originally conceived in 1997, and followed by many years of planning, NEON entered its construction phase in 2012. Current plans are for the network to be fully operational in 2017, and for data from NEON to be collected for 30 years.

The 60 NEON sites encompass the continental U.S., Alaska, Hawaii and Puerto Rico. Sites were selected to represent a diverse range of vegetation communities, climate zones, land types, and land-use categories. The current list of NEON data products to be collected at each site includes over 500 different entries, spanning both field and remote sensing observations. Items range from genetic sequences and isotope analyses of field samples to temperature and wind speed measurements from meteorological instruments. Additionally, in what has become a welcome trend within the community, NEON data is being distributed under an open access policy.

Of particular interest to the remote sensing community is that NEON includes an Airborne Observation Platform (AOP) that will be used to collect digital photography, imaging spectroscopy data, and full-waveform LiDAR data. To accommodate the geographic distribution of NEON sites, this same suite of remote sensing instrumentation will be deployed on three different aircraft. Note that remote sensing data collection, as well as testing and validation of analysis protocols, has already begun and preliminary data is available upon request.

Given its scope, it is clear that the data and information derived from the NEON project will have a profound impact on our understanding of the natural environment and our ability to assess ecological change.

For more information on NEON: http://www.neoninc.org/

Satellites, Technology and Palm Trees – It’s all about CSTARS

HySpeed Computing recently visited CSTARS to learn more about University of Miami’s remote sensing facility.

A short distance south of Miami, just down the Florida Turnpike, and surrounded by a lush tropical landscape, is an advanced satellite download and image analysis facility. CSTARS – the Center for Southeastern Tropical Advanced Remote Sensing, owned and operated by the University of Miami – provides state-of-the-art research capabilities and data access for scientists around the world.

(Image credit: CSTARS)

The CSTARS facility was purchased by the University of Miami in 2000 and, after strategically phasing in new infrastructure and operations, officially launched in 2003. Present today on the 78-acre grounds are two 11.3m antennas, one 20m antenna, and several buildings containing the system controls and data processing equipment. These antennas are the links that ultimately connect data from orbiting satellites to researchers on the ground. While much of the facility has been designed to be automated, a number of scientists and staff are located onsite to handle satellite and antenna operations, conduct research investigations, and perform system maintenance.

Since its inception, imagery downloaded through the CSTARS antennas has served as the foundation for a diverse range of scientific studies, including topics such as assessing water level changes in the Florida Everglades, investigating land subsidence trends, tracking global ocean currents, and monitoring volcanic activity. CSTARS is also notably a partner in one of the Department of Homeland Security Centers of Excellence, with the particular objective of improving maritime and port security. The facility has also played an important role in damage assessments and relief efforts associated with Hurricane Katrina, the Haiti earthquake and the Deepwater Horizon oil spill.

Interestingly, CSTARS operates satellite communications for the U.S. Antarctic Amundsen-Scott South Pole Station, thus providing a vital link between the station and the outside world. This is accomplished using GOES-3 (Geostationary Operational Environmental Satellite), a weather satellite launched in 1978 that ceased functioning in that role in 1993 but was reactivated as a communications satellite in 1995. It’s a great example of the resourcefulness and engineering dexterity that have enabled GOES-3 to continue operating and provide voice and data transmission to and from the Antarctic station.

CSTARS also represents an interesting piece of technology history in south Florida. The facility previously served as the site of the U.S. Naval Observatory Time Service Alternate Master Clock Station, which had the responsibility of providing accurate “atomic time” in the event of a failure of the master atomic clock in Washington, D.C. During the same period it was also one of the stations for the U.S. Very Long Baseline Interferometry program, which contributed fundamental data for better understanding the dynamics of the Earth’s surface. A further look through the U.S. Naval Observatory station history reveals that many other unique scientific advances were achieved at the facility through the years.

From past to present, technology has played an integral part in defining this small plot of land in south Florida. For more information on CSTARS: http://www.cstars.miami.edu/