Remote Sensing Analysis in the Cloud – Introducing the HICO Image Processing System

HySpeed Computing is pleased to announce the release of the HICO Image Processing System – a prototype web application for on-demand remote sensing image analysis in the cloud.

HICO IPS example: chlorophyll-a concentration for Chesapeake Bay

What is the HICO Image Processing System?

The HICO IPS is an interactive web application that allows users to select an image and an algorithm, dynamically launch analysis routines in the cloud, and then see the results displayed directly in the map interface.
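
For readers curious how such a request might look under the hood, here is a minimal sketch of the request/response pattern in Python. The base URL, endpoint path, scene identifier, algorithm name, and response fields are all illustrative assumptions, not the actual HICO IPS interface.

```python
# Minimal sketch (not the actual HICO IPS API): requesting an on-demand
# analysis by naming an image and an algorithm, then retrieving the result.
# Endpoint, parameter names, and response format are assumptions.
import requests

BASE_URL = "http://example.com/hico-ips"  # hypothetical service URL

response = requests.post(
    f"{BASE_URL}/analyses",
    json={
        "image": "HICO_Chesapeake_Bay_2012",     # hypothetical scene id
        "algorithm": "chlorophyll_concentration",  # hypothetical task name
        "parameters": {"mask_land": True},
    },
    timeout=60,
)
response.raise_for_status()

result = response.json()
print("Result layer URL:", result.get("result_url"))  # layer shown on the map
```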

The system’s capabilities are demonstrated using imagery collected by the Hyperspectral Imager for the Coastal Ocean (HICO) located on the International Space Station, and example algorithms are included for assessing coastal water quality and other nearshore environmental conditions.

What is needed to run the HICO IPS?

No specialized software is required. You just need an internet connection and a web browser to run the application (we suggest using Google Chrome).

How is this different from online map services?

This is an application server, not a map server, so all the results you see are generated dynamically at your request. It’s remote sensing image analysis in the cloud.

What software was used to create the HICO IPS?

The HICO IPS is a combination of commercial and open-source software, with core image processing performed using the recently released ENVI Services Engine.

What are some of the advantages of this system?

The system can be configured for any number of remote sensing instruments and applications. It thus provides an adaptable framework for rapidly implementing new algorithms, and for making those applications and their output readily available to the global user community.

Try it out today and let us know what you think: http://hyspeedgeo.com/HICO/

 

Related posts

Calculating a land/water mask using HICO IPS

Deriving chlorophyll concentration using HICO IPS

Evaluating water optical properties using HICO IPS

Characterizing shallow coastal environments using HICO IPS


ENVI Analytics Symposium – Come explore the next generation of geoanalytic solutions

HySpeed Computing is pleased to announce our sponsorship of the upcoming ENVI Analytics Symposium taking place in Boulder, CO from August 25-26, 2015.


The ENVI Analytics Symposium (EAS) will bring together the leading experts in remote sensing science to discuss technology trends and the next generation of solutions for advanced analytics. These topics are important because they can be applied to a diverse range of needs in environmental and natural resource monitoring, global food production, security, urbanization, and other fields of research.

The need to identify technology trends and advanced analytic solutions is being driven by the staggering growth in high-spatial and spectral resolution earth imagery, radar, LiDAR, and full motion video data. Join your fellow thought leaders and practitioners from industry, academia, government, and non-profit organizations in Boulder, Colorado for an intensive exploration of the latest advancements of analytics in remote sensing.

Core topics to be discussed at this event include Algorithms and Analytics, Applied Research, Geospatial Big Data, and Remote Sensing Phenomenology.

For more information: http://www.exelisvis.com/eas/HOME.aspx

We look forward to seeing you there.

Geospatial Solutions in the Cloud

Source: Exelis VIS whitepaper – 12/2/2014 (reprinted with permission)

What are Geospatial Analytics?

Geospatial analytics allow people to ask questions of data that exist within a spatial context. Usually this means extracting information from remotely sensed data, such as multispectral imagery or LiDAR, that is focused on observing the Earth and the things happening on it, either in a static sense or over a period of time. Familiar examples of this type of geospatial analysis include Land Classification, Change Detection, Soil and Vegetation Indices, and, depending on the bands of your data, Target Detection and Material Identification. However, geospatial analytics can also mean analyzing data that is not optical in nature.

So what other types of problems can geospatial analytics solve? Geospatial analytics comprise more than just images laid over a representation of the Earth. They can ask questions of any type of geospatial data, and provide insight into static and changing conditions within a multi-dimensional space. Things like aircraft vectors in space and time, wind speeds, or ocean currents can be introduced into geospatial algorithms to provide more context to a problem and to enable new correlations to be made between variables.

Many times, advanced analytics like these can benefit from the power of cloud-based, or server-based, computing. Benefits of cloud-based geospatial analytics include the ability to serve on-demand analytic requests from connected devices, run complex algorithms on large datasets, or perform continuous analysis on a series of changing variables. Cloud analytics also improve the ability to conduct multi-modal analysis, or processes that take into account many different types of geospatial information.

Here we can see vectors of a UAV along with the ground footprint of the sensor overlaid in Google Earth™, as well as a custom interface built on ENVI that allows users to visualize real-time weather data in four dimensions (Figure 1).


Figure 1 – Multi-Modal Geospatial Analysis – data courtesy NOAA

These are just a few examples of non-traditional geospatial analytics that cloud-based architecture is very good at solving.

Cloud-Based Geospatial Analysis Models 

So let’s take a quick look at how cloud-based analytics work. There are two different operational models for running analytics: the on-demand model and the batch process model. In an on-demand model (Figure 2), a user generally requests a specific piece of information from a web-enabled device such as a computer, a tablet, or a smart phone, making the request to a cloud-based resource.


Figure 2 – On-Demand Analysis Model

Next, the server identifies the requested data and runs the selected analysis on it. This leverages scalable server architecture that can vastly decrease the amount of time it takes to run the analysis and eliminate the need to host the data or the software on the web-enabled device. Finally, the requested information is sent back to the user, usually at a fraction of the bandwidth cost required to move large amounts of data or full resolution derived products through the internet.
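
As a rough illustration of this on-demand flow, the sketch below submits a hypothetical analysis job, polls for completion, and downloads only the small derived product rather than the full-resolution source data. The server URL, endpoints, task name, and JSON fields are assumptions made for the example, not any specific product’s API.

```python
# Hypothetical submit-and-poll pattern for an on-demand cloud analysis job.
# Endpoint paths and JSON fields are assumptions for illustration only.
import time
import requests

SERVER = "https://analytics.example.com"  # assumed analysis server

# 1. The device submits a small request describing the desired product.
job = requests.post(
    f"{SERVER}/jobs",
    json={"task": "change_detection", "scene": "scene_123",
          "roi": [-76.4, 37.1, -75.9, 37.6]},
    timeout=30,
).json()

# 2. The server does the heavy lifting; the client just polls for status.
while True:
    status = requests.get(f"{SERVER}/jobs/{job['id']}", timeout=30).json()
    if status["state"] in ("succeeded", "failed"):
        break
    time.sleep(5)

# 3. Only the derived product (a small file) travels back to the device.
if status["state"] == "succeeded":
    product = requests.get(status["product_url"], timeout=60)
    with open("change_map.tif", "wb") as f:
        f.write(product.content)
```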

In the automated batch process analysis model (Figure 3), the cloud is designed to apply prescribed analyses to data as it becomes available to the system, reducing the amount of manual interaction and time needed to prepare or analyze the data. This system can take in huge volumes of data from various sources, such as aerial or satellite images, vector data, full motion video, or radar, and then run a set of pre-determined analyses on that data depending on the data type and the requested information.


Figure 3 – Automated Batch Process Model

Once the data has been pre-processed, it is ready for consumption, and the information is either pushed out to another cloud-based asset, such as an individual user who needs to request information or monitor assets in real time, or simply placed into a database in a ‘ready state’ to be accessed and analyzed later.

The ability of this type of system to leverage the computing power of scalable server stacks enables the processing of huge amounts of data and greatly reduces the time and resources needed to get raw data into a consumable state.
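
One common way to realize this batch pattern is a simple ingest loop that applies a pre-determined pipeline based on data type. The sketch below is a minimal illustration, assuming hypothetical directory locations and task names; a production system would more likely use a message queue and a scalable worker pool.

```python
# Illustrative batch-processing loop: watch an ingest directory and apply a
# prescribed analysis based on data type. Directory names and the mapping of
# file extensions to analyses are assumptions, not part of any real system.
import time
from pathlib import Path

INGEST_DIR = Path("/data/ingest")        # assumed drop location for new data
PROCESSED_DIR = Path("/data/processed")  # assumed 'ready-state' store

# Pre-determined analyses keyed by data type (hypothetical task names).
PIPELINES = {
    ".tif": ["radiometric_calibration", "land_classification"],
    ".las": ["ground_filtering", "canopy_height_model"],
}

def run_task(task_name: str, path: Path) -> None:
    """Placeholder for dispatching a task to the processing cluster."""
    print(f"Running {task_name} on {path.name}")

while True:
    for path in INGEST_DIR.iterdir():
        tasks = PIPELINES.get(path.suffix.lower())
        if tasks is None:
            continue  # unknown data type; leave it alone
        for task in tasks:
            run_task(task, path)
        path.rename(PROCESSED_DIR / path.name)  # move to the processed store
    time.sleep(10)  # poll interval; a real system might use event triggers
```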

Solutions in the Cloud

HySpeed Computing

Now let’s take a look at a couple of use cases that employ ENVI capabilities in the cloud. The first is a web-based interface that allows users to perform on-demand geospatial analytics on hyperspectral data supplied by HICO™, the Hyperspectral Imager for the Coastal Ocean (Figure 4). HICO is a hyperspectral imaging spectrometer that is attached to the International Space Station (ISS) and is designed specifically for sampling the coastal ocean in an effort to further our understanding of the world’s coastal regions.


Figure 4 – The HICO Sensor – image courtesy of NASA

Developed by HySpeed Computing, the prototype HICO Image Processing System (Figure 5) allows users to conduct on-demand image analysis of HICO’s imagery from a web-based browser through the use of ENVI cloud capabilities.


Figure 5 – The HICO Image Processing System – data courtesy of NASA

The interface exposes several custom ENVI tasks designed specifically to take advantage of the unique spectral resolution of the HICO sensor to extract information characterizing the coastal environment. This type of interface is a good example of the on-demand scenario presented earlier, as it allows users to conduct on-demand analysis in the cloud without the need to have direct access to the data or the computing power to run the hyperspectral algorithms.

The goal of this system is to provide ubiquitous access to the robust HICO catalog of hyperspectral data, as well as the ENVI algorithms needed to analyze it. This will give researchers and other analysts the ability to conduct valuable coastal research through web-based interfaces, while capitalizing on the efforts that the Office of Naval Research, NASA, and Oregon State University have invested in the development, deployment, and operation of HICO.

Milcord

Another use case involves a real-time analysis scenario from a company called Milcord and their dPlan Next Generation Mission Manager (Figure 6). The goal of dPlan is to “aid mission managers by employing an intelligent, real-time decision engine for multi-vehicle operations and re-planning tasks” [1]. In other words, dPlan helps operators build UAV flight plans based on a number of dynamic factors, and delivers the best plan for multiple assets both before and during the actual operation.


Figure 6 – The dPlan Next Generation Mission Manager

Factors used to help score the flight plans include fuel availability, schedule metrics based upon priorities for each target, and what are known as National Imagery Interpretability Rating Scales, or NIIRS (Figure 7). NIIRS are used “to define and measure the quality of images and performance of imaging systems. Through a process referred to as ‘rating’ an image, the NIIRS is used by imagery analysts to assign a number which indicates the interpretability of a given image.” [2]


Figure 7 – Extent of NIIRS 1-9 Grids Centered in an Area Near Calgary

These factors are combined into a cost function, and dPlan uses that cost function to find the optimal flight plan for multiple assets over a multitude of targets. dPlan also performs a cost-benefit analysis to indicate, when an asset cannot reach all targets, which target would be the lowest cost to remove from the plan, or whether another asset could visit that target instead.
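
As a toy illustration of how such a cost function might combine these factors, the sketch below scores candidate plans with invented weights and selects the cheapest. The factor names, weights, and scoring are assumptions for the example and do not reflect dPlan’s actual formulation.

```python
# Toy weighted cost function for comparing candidate flight plans.
# All factors, weights, and values here are invented for illustration.
from dataclasses import dataclass

@dataclass
class PlanOption:
    name: str
    fuel_used: float         # fraction of available fuel consumed
    schedule_penalty: float  # missed-priority penalty, higher is worse
    mean_niirs: float        # predicted image interpretability, higher is better

def plan_cost(p: PlanOption, w_fuel=1.0, w_sched=2.0, w_niirs=1.5) -> float:
    # Lower cost is better; good NIIRS reduces the total cost.
    return w_fuel * p.fuel_used + w_sched * p.schedule_penalty - w_niirs * p.mean_niirs

options = [
    PlanOption("route_a", fuel_used=0.6, schedule_penalty=0.2, mean_niirs=5.5),
    PlanOption("route_b", fuel_used=0.4, schedule_penalty=0.5, mean_niirs=6.0),
]

best = min(options, key=plan_cost)
print("Selected plan:", best.name)
```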

dPlan employs a custom ESE application to generate huge grids of Line of Sight and NIIRS values associated with a given asset and target (Figure 8). dPlan uses this grid of points to generate route geometry, for example how close and at what angle the asset needs to approach the target.


Figure 8 – dPlan NIIRS Workflow

The cloud-computing power leveraged by dPlan allows users to re-evaluate flight plans on the fly, taking into account new information as it becomes available in real time. dPlan is a great example of how cloud-based computing combined with powerful analysis algorithms can solve complex problems in real time and reduce the resources needed to make accurate decisions amidst changing environments.

Solutions in the Cloud

So what do we do here at Exelis to enable organizations like HySpeed Computing and Milcord to ask these kinds of questions of their data and retrieve reliable answers? The technology they’re using is called the ENVI Services Engine (Figure 9), an enterprise-ready version of the ENVI image analytics stack. We currently have over 60 out-of-the-box analysis tasks built into it, and are creating more with every release.


Figure 9 – The ENVI Services Engine

The real value here is that the ENVI Services Engine allows users to develop their own analysis tasks and expose them through the engine. This is what enables users to develop unique solutions to geospatial problems and share them as repeatable processes for others to use. These solutions can be run over and over again on different data and provide consistent, dependable information to the people requesting the analysis. The cloud-based technology makes the engine easy to access from web-enabled devices while leveraging the enormous computing power of scalable server instances. This combination of customizable geospatial analysis tasks and virtually limitless computing power begins to address some of the limiting factors of analyzing what is known as big data: datasets so large and complex that traditional computing practices are not sufficient to identify correlations within disconnected data streams.

Our goal here at Exelis is to enable you to develop custom solutions to industry-specific geospatial problems using interoperable, off-the-shelf technology. For more information on what we can do for you and your organization, please feel free to contact us.

Sources:

[1] 2014. Milcord. “Geospatial Analytics in the Cloud: Successful Application Scenarios” webinar. https://event.webcasts.com/starthere.jsp?ei=1042556

[2] 2014. The Federation of American Scientists. “National Image Interpretability Rating Scales”. http://fas.org/irp/imint/niirs.htm

 

VISualize 2014 – Call for abstracts now open

UPDATE (6-April-2015): Announcing the ENVI Analytics Symposium – taking place in Boulder, CO from August 25-26, 2015. Those looking for the VISualize symposium, which has been indefinitely postponed, should consider attending the inaugural ENVI Analytics Symposium as a great opportunity to explore the next generation of geoanalytic solutions.

Just announced!  VISualize 2014, the annual IDL & ENVI User Group Meeting hosted by Exelis Visual Information Solutions, will be taking place October 14-16 at the World Wildlife Fund in Washington, DC.

HySpeed Computing is honored to once again be co-sponsoring this year’s VISualize. We are excited to speak with you and see your latest remote sensing applications.

At this year’s meeting HySpeed Computing will be presenting results from our latest project – a prototype cloud computing system for remote sensing image processing and data visualization. We hope to see you there.

Abstract submission deadline is September 12. Register today!


“Please join us at VISualize 2014, October 14th – 16th, at the World Wildlife Fund in Washington, DC. This three day event explores real-world applications of ENVI and IDL with a specific focus on Modern Approaches for Remote Sensing & Monitoring Environmental Extremes.

Suggested topics include:

  • Using new data platforms such as UAS, microsatellites, and SAR sensors for environmental assessments
  • Land subsidence monitoring and mapping techniques
  • Remote sensing solutions for precision agriculture mapping
  • Drought, flood, and extreme precipitation event monitoring and assessment
  • Wildfire and conservation area monitoring, management, mitigation, and planning
  • Monitoring leaks from natural gas pipelines

Don’t miss this excellent opportunity to connect with industry thought leaders, researchers, and scientists.”

Register today!

 

Advantages of Cloud Computing in Remote Sensing Applications

The original version of this post appears in the June 26 edition of Exelis VIS’s Imagery Speaks, by James Goodman, CEO of HySpeed Computing

Below we explore the role of cloud computing in geospatial image processing, and the advantages this technology provides to the overall remote sensing toolbox.

The underlying concept of cloud computing is not new. Dating back to the advent of the client-server model in mainframe computing, the use of local devices to perform tasks on a server, or set of connected servers, has a long history within the computing industry.

With the rise of the personal computer, and the relative cost efficiency of memory and processing speed for these systems, there ensued a similarly rich history of computing using the local desktop environment.

As a result, in many application domains, including remote sensing, a dichotomy developed in the computing industry, with a large portion of the user community relying on personal computers, while large-scale servers were used mostly by government and big business.

More recently, however, there has been an industry-wide surge in the prevalence of cloud computing applications within the general user community. Driven in large part by rapidly growing data volumes and the profound increase and diversity of mobile computing devices, as well as a desire for access to centralized analytics, cloud computing is now a common component in our everyday experience.

Where does cloud computing fit within remote sensing? Given the online availability of weather maps and high-resolution satellite base maps, it can be argued that cloud computing is already regularly used in remote sensing. However, there are innumerable other remote sensing applications, with societal and economic benefits, that are not currently available in the cloud.

Since most of these applications are not directed at the consumer market, but instead relevant predominantly to business, government, education and scientific concerns, what then are the advantages of cloud computing in remote sensing?

  • Provides online, on-demand, scalable image processing capabilities.
  • Delivers image-derived products and visualization tools to a global user community.
  • Allows processing tools to be efficiently co-located with large image databases.
  • Removes software barriers and hardware requirements from non-specialists.
  • Facilitates rapid integration and deployment of new algorithms and processing tools.
  • Accelerates technology transfer in remote sensing through improved application sharing.
  • Connects remote sensing scientists more directly with the intended end-users.

At HySpeed Computing we are partnering with Exelis Visual Information Solutions to develop a cloud computing platform for processing data from the Hyperspectral Imager for the Coastal Ocean (HICO) – a uniquely capable sensor located on the International Space Station (ISS). The backbone of the computing framework is based on the ENVI Services Engine, with a user interface built using open-source software tools such as GeoServer and Leaflet.
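
To give a flavor of how the open-source pieces can fit together, the sketch below pushes an image-derived GeoTIFF into GeoServer through its REST API so that a Leaflet front end could display it as a WMS layer. The host, workspace, store name, credentials, and file name are assumptions for the example; only the general GeoServer file-upload endpoint is standard, and the workspace is assumed to already exist.

```python
# Sketch: publish an image-derived GeoTIFF to GeoServer via its REST API so a
# Leaflet map can display it as a WMS layer. Host, workspace, and credentials
# are assumptions; change them for a real deployment.
import requests

GEOSERVER = "http://localhost:8080/geoserver"  # assumed local GeoServer
AUTH = ("admin", "geoserver")                  # default credentials (change in practice)
WORKSPACE = "hico"                             # assumed existing workspace
STORE = "chlorophyll_demo"

with open("chlorophyll.tif", "rb") as f:
    r = requests.put(
        f"{GEOSERVER}/rest/workspaces/{WORKSPACE}/coveragestores/{STORE}/file.geotiff",
        data=f,
        headers={"Content-type": "image/tiff"},
        auth=AUTH,
        timeout=120,
    )
r.raise_for_status()

# A Leaflet front end could then request this layer through GeoServer's WMS
# endpoint, e.g. layers="hico:chlorophyll_demo".
print(f"Published layer {WORKSPACE}:{STORE}")
```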

A prototype version of the web-enabled HICO processing system will soon be publicly available for testing and evaluation by the community. Links to access the system will be provided on our website once it is released.

We envision a remote sensing future where the line between local and cloud computing becomes obscured, where applications can be interchangeably run in any computing environment, where developers can utilize their programming language of choice, where scientific achievements and innovations are readily shared through a distributed processing network, and where image-derived information is rapidly distributed to the global user community.

And what’s most significant about this vision is that the future is closer than you imagine.

About HySpeed Computing: Our mission is to provide the most effective analysis tools for deriving and delivering information from geospatial imagery. Visit us at hyspeedcomputing.com.

 

NASA Takes Over Navy Instrument On ISS

A version of this article appears in the May 19 edition of Aviation Week & Space Technology, p. 59, Frank Morring, Jr.

HREP on JEM-EF

A hyperspectral imager on the International Space Station (ISS) that was developed by the U.S. Navy as an experiment in littoral-warfare support is finding new life as an academic tool under NASA management, and already has drawn some seed money as a pathfinder for commercial Earth observation.

Facing Earth in open space on the Japanese Experiment Module’s porchlike Exposed Facility, the Hyperspectral Imager for the Coastal Ocean (HICO) continues to return at least one image a day of near-shore waters with unprecedented spectral and spatial resolution.

HICO was built to provide a low-cost means to study the utility of hyperspectral imaging from orbit in meeting the Navy’s operational needs close to shore. Growing out of its experiences in the Persian Gulf and other shallow-water operations, the Office of Naval Research wanted to evaluate the utility of space-based hyperspectral imagery to characterize littoral waters and conduct bathymetry to track changes over time that could impact operations.

The Naval Research Laboratory (NRL) developed HICO, which was based on airborne hyperspectral imagery technology and off-the-shelf hardware to hold down costs. HICO was launched Sept. 10, 2009, on a Japanese H-2 transfer vehicle as part of the HICO and RAIDS (Remote Atmospheric and Ionospheric Detection System) Experimental Payloads; it returned its first image two weeks later.

In three years of Navy-funded operations, HICO “exceeded all its goals,” says Mary Kappus, coastal and ocean remote sensing branch head at NRL.

“In the past it was blue ocean stuff, and things have moved more toward interest in the coastal ocean,” she says. “It is a much more difficult environment. In the open ocean, multi-spectral was at least adequate.”

NASA, the U.S. partner on the ISS, took over HICO in January 2013 after Navy funding expired. The Navy also released almost all of the HICO data collected during its three years running the instrument. It has been posted for open access on the HICO website managed by Oregon State University.

While the Navy program was open to most researchers, the principal-investigator approach and the service’s multistep approval process made it laborious to gain access to the HICO instrument.

“[NASA] wanted it opened up, and we had to get permission from the Navy to put the historical data on there,” says Kappus. “So anything we collect now goes on there, and then we ask the Navy for permission to put old data on there. They reviewed [this] and approved releasing most of it.”

Under the new regime NRL still operates the HICO sensor, but through the NASA ISS payload office at Marshall Space Flight Center. This more-direct approach has given users access to more data and, depending on the target’s position relative to the station orbit, a chance to collect two images per day instead of one. Kappus explains that the data buffer on HICO is relatively small, so coordination with the downlink via the Payload Operations Center at Marshall is essential to collecting data before the buffer fills up.

Task orders are worked through the same channels. Presenting an update to HICO users in Silver Spring, Md., on May 7, Kappus said 171 of 332 total “scenes” targeted between Nov. 11, 2013, and March 12 were requested by researchers backed by the NRL and NASA; international researchers comprised the balance.

Data from HICO is posted on NASA’s Ocean Color website, where usage also is tracked. After the U.S., “China is the biggest user” of the website data, Kappus says, followed by Germany, Japan and Russia. The types of data sought, such as seasonal bathymetry that shows changes in the bottom of shallow waters, have remained the same through the transition from Navy to NASA.

“The same kinds of things are relevant for everybody; what is changing in the water,” she says.

HICO offers unprecedented detail from its perch on the ISS, providing 90-meter (295-ft.) resolution across wavelengths of 380-960 nanometers sampled at 5.7 nanometers. Sorting that rich dataset requires sophisticated software, typically custom-made and out of the reach of many users.

To expand the user set for HICO and future Earth-observing sensors on the space station, the Center for the Advancement of Science in Space, the non-profit set up by NASA to promote the commercial use of U.S. National Laboratory facilities on the ISS, awarded a $150,000 grant to HySpeed Computing, a Miami-based startup, and [Exelis] to demonstrate an online imaging processing system that can rapidly integrate new algorithms.

James Goodman, president/CEO of HySpeed, says the idea is to build a commercial way for users to process HICO data for their own needs at the same place online that they get it.

“Ideally a copy of this will [be] on the Oregon State server where the data resides,” Goodman says. “As a HICO user you would come in and say ‘I want to use this data, and I want to run this process.’ So you don’t need your own customized remote-sensing software. It expands it well beyond the research crowd that has invested in high-end remote-sensing software. It can be any-level user who has a web browser.”

HySpeed Computing Announces New Project – Remote Sensing on the International Space Station

CASIS Reaches Agreement with HySpeed Computing and Exelis for Hyperspectral Image Analysis Using Cloud Computing

Originally published by CASIS on February 20, 2014


KENNEDY SPACE CENTER, FL. (February 20th, 2014) – The Center for the Advancement of Science in Space (CASIS) today announced an agreement with HySpeed Computing and Exelis for a project demonstrating cloud computing capabilities for image processing and remote sensing applications on the International Space Station (ISS). CASIS was selected by NASA in July 2011 to maximize use of the ISS U.S. National Laboratory.

HySpeed Computing and Exelis plan to develop a prototype online, on-demand image processing system using example data from the Hyperspectral Imager for the Coastal Ocean (HICO). The system will leverage the recently released ENVI Services Engine, and include a web-interface for users to access a collection of image processing applications derived from the HICO user community.

HICO is a hyperspectral instrument specializing in visible and near-infrared camera technology, designed specifically for imaging the coastal zone and ocean waters. HICO is part of the first U.S. experiment payload on the Japanese Experiment Module – Exposed Facility (JEM-EF) on the International Space Station (ISS), and has acquired thousands of images from around the globe since its launch in 2009.

“We are excited to be supported by CASIS,” said HySpeed Computing President James Goodman. “We believe this project will demonstrate an effective pathway for inspiring innovation and facilitating technology transfer in the geospatial marketplace.”

“This partnership with HySpeed Computing and Exelis is another example of leveraging existing assets onboard the ISS for terrestrial benefit,” said CASIS Director of Operations, Ken Shields. “During its existence, HICO has proven to be a dynamic camera capable of delivering the unique vantage point of the ISS to better understand our oceans and shorelines.”

For information about CASIS opportunities, including instructions on submitting research ideas, please visit:  www.iss-casis.org/solicitations

Additionally, CASIS currently has a solicitation in remote sensing open to the research community. Letters of intent are required to move forward in the proposal process. Letters of intent are due tomorrow, February 21, 2014. To learn more visit: www.iss-casis.org/Opportunities/Solicitations/RFPRemoteSensing.aspx

# # #

About CASIS: The Center for the Advancement of Science in Space (CASIS) was selected by NASA in July 2011 to maximize use of the International Space Station (ISS) U.S. National Laboratory through 2020. CASIS is dedicated to supporting and accelerating innovations and new discoveries that will enhance the health and wellbeing of people and our planet. For more information, visit: http://www.iss-casis.org/.

About the ISS National Laboratory: In 2005, Congress designated the U.S. portion of the International Space Station as the nation’s newest national laboratory to maximize its use for improving life on Earth, promoting collaboration among diverse users and advancing STEM education. This unique laboratory environment is available for use by other U.S. government agencies and by academic and private institutions, providing access to the permanent microgravity setting, vantage point in low earth orbit and varied environments of space.

# # #

Source: Feb 20, 2014 CASIS press release.

Remote Sensing in the Cloud – Introducing the ENVI Services Engine

A popular topic these days is cloud computing, and the world of remote sensing is no exception. New developments in software, hardware, and connectivity are offering innovative options for performing remote sensing image analysis and visualization tasks in the cloud.

One example of the recent advances in cloud computing capabilities for geospatial scientists is the development of the ENVI Services Engine by Exelis Visual Information Solutions (Exelis VIS). Taking what was previously the domain of desktop computing, this software engine brings the image analysis tools of ENVI into the cloud. This translates into an ability to deploy ENVI processing tools, such as image classification, anomaly detection, and change detection, into an online environment. Additionally, because the system uses an HTTP REST interface and is built on open-source standards, the software can be implemented across a variety of operating systems and hardware devices.

This flexibility of the ENVI Services Engine, and cloud computing in general, speaks directly to the “bring your own device” movement. Rather than being limited to certain operating systems or certain types of hardware, users have many more options to satisfy their preferences. Access and processing thus becomes feasible from a variety of tablets, mobile phones and laptops, in addition to the usual array of desktops and workstations.

As an example, consider the ability to access imagery and derived data layers from your favorite mobile device. Now consider being able to adjust your analysis on the fly from this same device based on observations made in the field. With the image processing tasks handled on remote servers, extensive computing capacity is no longer required on your local device. This enables not just remote access to image processing, but also on-demand visualization and display of entire databases of images and results.

Having the image processing tasks performed on the same servers where the imagery is stored, or on servers close to them, is also more computationally efficient, since imagery does not need to be transferred to local computers and the results then transferred back to the servers. This is particularly relevant for large data archives, where even simple changes to existing algorithms, or the addition of new algorithms, may necessitate re-processing vast volumes of data.

Although the concept of cloud computing is not new, it has become apparent that the software and hardware landscape has evolved, making cloud computing for geospatial analysis significantly more attractive than ever before.

Attendees of the VISualize conference earlier this year received a sneak-peek at the ENVI Services Engine. The software was also recently on display at the GEOINT conference this past October. However, official release of the software isn’t scheduled until early 2013. For more information: http://www.exelisvis.com/

Reef Management in the Cloud – Application of innovative new technologies

International Coral Reef Symposium 2012 – Cairns, Australia – Thoughts from Day 5

The “cloud” and “cloud computing” are becoming increasingly prevalent in consumer applications. Our email is stored in the cloud, much of our personal data is stored in the cloud, and our mobile devices commonly access information stored in the cloud. The same technology that makes these applications possible is now being harnessed for environmental management.


Dr. Julie Scopelitis demonstrating the Qehnelo software

Preserving natural ecosystems typically involves a complex balance of scientific, political, societal, and economic facts, needs, and viewpoints. However, the data needed to support the associated decision-making process is often stored in physically separate locations. As a result, despite growing global connectivity, accessing and integrating this data can be a challenge, particularly in remote locations.

The cloud, or more specifically the vast network of remote servers and its associated software, is the foundation allowing access to these diverse sets of data. Rather than transmitting copies of large volumes of data and/or software to different users, cloud computing allows users to remotely access distributed storage locations. In many cases this approach is not only more efficient, but also more democratic, allowing greater distribution of limited computing resources to a larger number of users.

An interesting example of cloud technology is Qehnelo, a web-based software product created by the New Caledonian company Bluecham. Qehnelo, whose name derives from a native phrase for “open door”, integrates remote data access with high-level decision support models. Dr. Julie Scopelitis is working on using this innovative software for coral reef monitoring and management. Through this software, Julie is able to better leverage her own expertise and ultimately put the power of advanced technology into the hands of managers, conservationists and scientists.

It is exciting to see such innovative new technology being used for coral reef management. We are sure to see this trend continue as computing resources become more affordable and more accessible.

NVIDIA Announces Future for GPU Computing

NVIDIA GPU Technology Conference – Day 2

HySpeed Computing’s president, James Goodman, is attending the 2012 NVIDIA GPU Technology Conference. He’ll be sharing his experiences, thoughts, and news coming out of the conference.

Entering the second day of the NVIDIA conference, we had high expectations for the keynote address and breaking news. The keynote did not disappoint.

The main speaking hall faded to black with a soundtrack playing in the background. The room of 3,000 was buzzing with anticipation. Then, the opening keynote speaker, CEO and co-founder of NVIDIA Jen-Hsun Huang, took the stage, commanding the attention of the entire eager audience. As part of his keynote, Jen-Hsun confirmed NVIDIA’s continued commitment to GPU computing by announcing the company has yet again “doubled down” on investing in the technology. This investment became immediately apparent as he also took the opportunity to announce the launch of their most advanced GPU to date, the Kepler GPU. (Watch the keynote speech here)

With more speed, power and energy efficiency than any previous NVIDIA GPU, the Kepler replaces the Fermi and promises to fundamentally advance computer graphics and computing. Built on three new technologies – SMX, Hyper-Q and Dynamic Parallelism – the Kepler provides exceptional new computing capabilities. It is also the first ever GPU designed for a world of cloud computing. Kepler will also be used as the technology behind GeForce Grid, which enables unprecedented low-latency live streaming of cloud gaming.

Other guest speakers and demonstrations of the day, joining Jen-Hsun on stage, included Sumit Dhawan, group vice president and general manager of Citrix; Grady Cofer, visual effects supervisor at Industrial Light & Magic; David Yen, senior vice president and general manager of Cisco Data Center Group; and Dave Perry, CEO and co-founder of Gaikai.