
Welcome to the Kellylab blog

geospatial matters

Please read the UC Berkeley Computer Use Policy. Only members can post comments on this blog.

Entries in data (97)

Tuesday
Dec022014

NASA NEX wins the 2014 HPCwire Readers' and Editors' Choice Award

Congratulations to the NASA NEX Team! They have won the 2014 HPCwire Readers’ & Editors’ Choice Award for the Best Data-Intensive System (End User focused).  See the article here: NASA Earth Exchange (NEX) Platform supports dozens of data-intensive projects in Earth sciences.

The NASA Earth Exchange (NEX) platform supports dozens of data-intensive projects in Earth sciences, bringing together supercomputers and huge volumes of NASA data, and enabling scientists to test hypotheses and execute modeling/analysis projects at a scale previously out of their reach. NEX-supported applications range from modeling El Niño, creating neighborhood-scale climate projections, assisting in crop water management, and mapping changes in forest structure across North America, to mapping individual tree crowns at continental scale as a foundation for new global science at unprecedented spatial resolution. NEX’s OpenNEX challenge ties in to White House initiatives, including Open Data, Big Data and Climate Data, which advance national goals to address climate change impacts and include competitions and challenges to foster regional innovation.

The GIF has been partnering with NASA NEX, and developing a framework to bring NEX data and analytical capabilities into HOLOS.

Monday
Sep292014

High resolution free DEM data released for Africa

[Image: the Niger River Delta shown at SRTM 3 arc-second (approx. 90 m) resolution, at SRTM 1 arc-second (approx. 30 m) resolution, and in Landsat 7 imagery from December 17, 2000]

Just in time for class on topography and rasters tomorrow: new high-resolution shuttle DEM data are being released for Africa. The image above shows the Niger River Delta at 90 m resolution, at 30 m resolution, and in Landsat imagery.
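The "approx. 90m" and "approx. 30m" labels follow directly from the arc-second definitions, and can be sanity-checked in a few lines of Python (a sketch: the 40,075 km equatorial circumference is the standard figure, and the cosine term is the usual first-order correction for latitude):

```python
import math

EARTH_EQ_CIRCUMFERENCE_M = 40_075_000  # Earth's equatorial circumference, ~40,075 km

def arcsec_to_meters(arcsec, latitude_deg=0.0):
    """Approximate ground distance spanned by an arc of longitude,
    shrinking with the cosine of latitude."""
    meters_per_arcsec = EARTH_EQ_CIRCUMFERENCE_M / (360 * 3600)
    return arcsec * meters_per_arcsec * math.cos(math.radians(latitude_deg))

# At the equator: 1 arc-second ≈ 30.9 m and 3 arc-seconds ≈ 92.8 m,
# matching the "approx. 30m" and "approx. 90m" labels above.
```

The latitude term is why SRTM cells cover less east-west ground the farther you move from the equator, even though the angular grid spacing is constant.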

From the press release: In Africa, accurate elevation (topographic) data are vital for pursuing a variety of climate-related studies that include modeling predicted wildlife habitat change; promoting public health in the form of warning systems for geography and climate-related diseases (e.g. malaria, dengue fever, Rift Valley fever); and monitoring sea level rise in critical deltas and population centers, to name just a few of many possible applications of elevation data.

On September 23, the National Aeronautics and Space Administration (NASA), the National Geospatial-Intelligence Agency (NGA), and the U.S. Geological Survey (USGS, a bureau of the U.S. Department of the Interior) released a collection of higher-resolution (more detailed) elevation datasets for Africa. The datasets were released following the President’s commitment at the United Nations to provide assistance for global efforts to combat climate change. The broad availability of more detailed elevation data across most of the African continent through the Shuttle Radar Topography Mission (SRTM) will improve baseline information that is crucial to investigating the impacts of climate change on African communities.

Enhanced elevation datasets covering remaining continents and regions will be made available within one year, with the next release of data focusing on Latin America and the Caribbean region. Until now, elevation data for the continent of Africa were freely available to the public only at 90-meter resolution. The datasets being released today and during the course of the next year resolve to 30-meters and will be used worldwide to improve environmental monitoring, climate change research, and local decision support. These SRTM-derived data, which have been extensively reviewed by relevant government agencies and deemed suitable for public release, are being made available via a user-friendly interface on USGS’s Earth Explorer website.

There is a nice slider comparing the 90 m and 30 m data here.

Wednesday
Apr162014

Cropland Data Layer (CDL) and National Land Cover Dataset (NLCD): new versions released this year

Both the NASS Cropland Data Layer (CDL) and the National Land Cover Dataset (NLCD) released new versions in early 2014. Links for download are here:

Tuesday
Apr082014

NLCD released and webinar April 15 2014

[Image: three panels of cyclical data (2001, 2006, 2011) from the National Land Cover Database depicting intervals of land cover change in the vicinity of Spring Valley, a suburb of Las Vegas, NV. Source: USGS]

Just released, the latest edition of the nation’s most comprehensive look at land-surface conditions from coast to coast shows the extent of land cover types from forests to urban areas. The National Land Cover Database (NLCD 2011) is made available to the public by the U.S. Geological Survey and partners.

Dividing the lower 48 states into 9 billion geographic cells, the massive database provides consistent information about land conditions at regional to nationwide scales. Collected in repeated five-year cycles, NLCD data is used by resource managers and decision-makers to conduct ecosystem studies, determine spatial patterns of biodiversity, trace indications of climate change, and develop best practices in land management.

Based on Landsat satellite imagery taken in 2011, NLCD 2011 describes the land cover of each 30-meter cell of land in the conterminous United States and identifies which ones have changed since the year 2006. Nearly six such cells — each 98 feet long and wide — would fit on a football field. Land cover is broadly defined as the biophysical pattern of natural vegetation, agriculture, and urban areas. It is shaped by both natural processes and human influences. NLCD 2011 updates the previous database version, NLCD 2006.
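The "9 billion cells" and football-field figures above are easy to verify with back-of-the-envelope arithmetic (a sketch; the 8.08 million km² value for the conterminous US land area is an assumed, commonly cited approximation):

```python
CELL_SIZE_M = 30.0          # NLCD cell size
CONUS_AREA_KM2 = 8.08e6     # approximate land area of the lower 48 states

# Number of 30 m x 30 m cells needed to tile the conterminous US
cells = CONUS_AREA_KM2 * 1e6 / CELL_SIZE_M**2   # ≈ 9.0e9, i.e. ~9 billion

# A 30 m cell is about 98.4 ft on a side, so nearly six of them fit
# on a 360 ft x 160 ft football field (including end zones)
FT_PER_M = 3.28084
cell_side_ft = CELL_SIZE_M * FT_PER_M            # ≈ 98.4 ft
cells_per_field = (360 * 160) / cell_side_ft**2  # ≈ 5.9
```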

Webinar about the release will be Tuesday, April 15, 2014, 2:00 PM Eastern Time: "New Version of the National Land Cover Database - April 4, 2014 Release"

The latest version of the National Land Cover Database (NLCD) for the conterminous United States will be publicly released on April 4, 2014. NLCD 2011 is the most up-to-date and extensive iteration of the National Land Cover Database, the definitive Landsat-based, 30-meter resolution land cover database for the Nation. NLCD 2011 products are completely integrated with those of previous versions (2001, 2006), providing a 10-year record of change for the Nation. Products include 16 classes of land cover, the percent of imperviousness in urban areas, and the percent of tree canopy cover. NLCD is constructed by the 10-member federal interagency Multi-Resolution Land Characteristics (MRLC) Consortium. This seminar will highlight the new features of NLCD 2011 and the related applications. (Contact: Collin Homer, 605-594-2714, homer@usgs.gov)
For more information and to download NLCD data, visit http://www.mrlc.gov/.
Please click the following link to join the webinar:
 https://usgs.webex.com/usgs/j.php?ED=279876177&UID=490357047&RT=MiM3

At start time of the webinar, each location must call one of the dial-in numbers:
From the National Center in Reston, dial internally x4848
From all other USGS/DOI locations, dial 703-648-4848
From non DOI locations, dial toll free 855-547-8255
After the voice prompt, please enter the Conference Security Code 73848024 followed by the # key. You will hear a tone confirming that you have successfully joined the conference call. If you weren't successful, you will hear another voice prompt with instructions.

Thursday
Jan092014

Big Data for sustainability: an uneven track record with great potential

An interesting position piece on the appropriate uses of big data for climate resilience. The author, Amy Luers, points out three opportunities and three risks.

She sums up:

"The big data revolution is upon us. How this will contribute to the resilience of human and natural systems remains to be seen. Ultimately, it will depend on what trade-offs we are willing to make. For example, are we willing to compromise some individual privacy for increased community resilience, or the ecological systems on which they depend?—If so, how much, and under what circumstances?"

Read more from this interesting article here.

Friday
Nov222013

The evolution of a Digital Earth 

In 1998, Al Gore made his now-famous speech entitled The Digital Earth: Understanding our planet in the 21st Century. He described the possibilities and need for the development of a new concept in earth science, communication and society. He envisioned technology that would allow us "to capture, store, process and display an unprecedented amount of information about our planet and a wide variety of environmental and cultural phenomena.” From the vantage point of our hyper-geo-immersed lifestyle today, his description of this Digital Earth is prescient yet rather cumbersome:

"Imagine, for example, a young child going to a Digital Earth exhibit at a local museum. After donning a head-mounted display, she sees Earth as it appears from space. Using a data glove, she zooms in, using higher and higher levels of resolution, to see continents, then regions, countries, cities, and finally individual houses, trees, and other natural and man-made objects. Having found an area of the planet she is interested in exploring, she takes the equivalent of a "magic carpet ride" through a 3-D visualization of the terrain.”

He said: "Although this scenario may seem like science fiction, most of the technologies and capabilities that would be required to build a Digital Earth are either here or under development. Of course, the capabilities of a Digital Earth will continue to evolve over time. What we will be able to do in 2005 will look primitive compared to the Digital Earth of the year 2020." In 1998, he listed the necessary technologies as: Computational Science, Mass Storage, Satellite Imagery, Broadband networks, Interoperability, and Metadata.

He anticipated change: "Of course, further technological progress is needed to realize the full potential of the Digital Earth, especially in areas such as automatic interpretation of imagery, the fusion of data from multiple sources, and intelligent agents that could find and link information on the Web about a particular spot on the planet. But enough of the pieces are in place right now to warrant proceeding with this exciting initiative.” 

[Image: example from NOAA's Science on a Sphere project]

Much has changed since he gave his talk, obviously. We have numerous examples of Virtual Globes for data exploration - for example, Google Earth, NASA’s WorldWind, ESRI’s ArcGIS Explorer, Bing Maps 3D, TerraExplorer, Marble.  (These virtual examples are made tangible with NOAA's terrific Science on a Sphere project.)

We also have realized a new vision of the Digital Earth that includes much more than immersive viewing of data. Today’s Digital Earth vision(s) include analytics and expertise for solving problems that are often cross-disciplinary and large in scale. Additionally, we make much more use today than was anticipated in 1998 of sensor networks and the geoweb (e.g. volunteered geographic information and crowdsourcing). Examples of this multi-disciplinary Digital Earth concept include Google Earth Engine (and its recent forest loss product), NASA Earth Exchange, and our own HOLOS.

NSF has adopted this concept for its EarthCube initiative. Last year NSF was looking for transformative concepts and approaches to create integrated data management infrastructures across the Geosciences. They were interested in the multifaceted challenges of modern, data-intensive science and education, and envisioned an environment where low adoption thresholds and new capabilities act together to greatly increase the productivity and capability of researchers and educators working at the frontiers of Earth system science. I am not sure if this will be funded in 2014, but the effort reaffirms that the Digital Earth concept is widespread and will likely be an important part of academia.

Thursday
Nov142013

NASA shares satellite and climate data on Amazon’s cloud


NASA has announced a partnership with Amazon Web Services that the agency hopes will spark wider collaboration on climate research. In an effort that is in some ways parallel to Google's Earth Engine, NASA has uploaded terabytes of data to Amazon's public cloud and made it available to anyone.

Three data sets are already up at Amazon. The first is climate change forecast data for the continental United States from NASA Earth Exchange (NEX) climate simulations, scaled down to make them usable outside of a supercomputing environment. The other two are satellite data sets—one from the US Geological Survey's Landsat, and the other a collection of Moderate Resolution Imaging Spectroradiometer (MODIS) data from NASA's Terra and Aqua Earth remote sensing satellites.
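Because the data sit in public S3 buckets, they can be fetched over plain HTTPS without AWS credentials. A minimal sketch of building such a URL — the bucket name ("nasanex") and the example key below are illustrative assumptions, not verified paths; consult the AWS public data set pages for the real layout:

```python
def public_s3_url(bucket, key):
    """Build the public HTTPS URL for an object in a public S3 bucket."""
    return "https://{}.s3.amazonaws.com/{}".format(bucket, key)

# Hypothetical example key for the NEX climate simulations
url = public_s3_url("nasanex", "NEX-DCP30/doc/README")
```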

More here.

Tuesday
Nov122013

New Berkeley Institute for Data Science Launched!

UC Berkeley is establishing a new institute to enable university researchers to harness the full potential of the data-rich world that today characterizes all fields of science and discovery. The Berkeley Institute for Data Science (BIDS) will be part of a multi-million dollar effort supported by the Gordon and Betty Moore Foundation and the Alfred P. Sloan Foundation.

The new 5-year, $37.8 million initiative was announced today at a meeting sponsored by the White House Office of Science and Technology Policy (OSTP) focused on developing innovative partnerships to advance technologies that support advanced data management and data analytic techniques.

The ambitious Moore/Sloan partnership, which also includes New York University and the University of Washington, will spur collaborations within and across the three campuses and other partners pursuing similar data-intensive science goals. The three PIs who lead the respective campus efforts – Saul Perlmutter at UC Berkeley, Ed Lazowska at the University of Washington, and Yann LeCun at NYU – will promote common approaches to form the basis for ongoing collaboration between the three campuses.

To provide a home for the new Berkeley Institute for Data Science, UC Berkeley has set aside renovated space in a historic library building on the central campus, in 190 Doe Library. The Institute is expected to move into its new quarters in spring 2014. To help address challenges related to creating and sustaining attractive career paths, the new Institute will offer new Data Science Fellow positions for faculty, post-doctoral fellows, and staff, to be shared with departmental partners across the campus. The new Institute will also offer support for graduate students and organize short courses, boot camps, hackathons, and many other activities.

More information about specific BIDS programs will be forthcoming in the coming weeks. The new Institute will be launched at a campus event on December 12, 2013. If you or your students and collaborators are interested in participating in the Data Science Faire that day, please be sure to register at http://vcresearch.berkeley.edu/datascience/dec12-registration. The deadline is November 25, 2013.

For updates and more information, please visit http://vcresearch.berkeley.edu/datascience/overview-data-science and contact datascience@berkeley.edu with any questions you may have.

Tuesday
Oct082013

California Geoportal Offers One-Stop Shop for Statewide GIS Data

The California Geoportal, officially launched in March 2013 (see here for the related launch press release), augments and in some ways replaces the original Cal-Atlas statewide GIS data download webpage with a simpler, smoother, and more intuitive website for all GIS-related data in the state. You can now search or browse for GIS data by geography and any corresponding metadata using traditional search queries as well as a standalone webGIS interface. The portal also provides direct download links to some Oregon and Nevada state GIS datasets. The site acts as a GIS data repository for publicly available GIS data and related documents and maps from state agencies and local and regional governments. Rather than hosting the physical data, the site instead acts as a library of direct download links that connect to the authors’ databases. The site also links you to other state GIS applications such as the California Coastal Geoportal and webGIS viewers from various state agencies.

[Screenshot of the CA Geoportal]

[Screenshot of the CA Geoportal Map Viewer]

See below for an informative video on how and why the portal was created and for highlights of its features:

Tuesday
Jun182013

Landsat 8 imagery available

From Kelly:

Data collected by the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS) onboard the Landsat 8 satellite are available to download at no charge from GloVis, EarthExplorer, or via the LandsatLook Viewer.

Orbiting the Earth every 99 minutes, Landsat 8 images the entire Earth every 16 days in the same orbit previously used by Landsat 5. Data products are available within 24 hours of reception. Check it.
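The two numbers in that sentence fit together: a roughly 99-minute orbit repeated over a 16-day cycle works out to about 233 orbits per repeat cycle. A quick check (a sketch, not an official mission figure):

```python
ORBIT_PERIOD_MIN = 99      # approximate orbital period from the post
REPEAT_CYCLE_DAYS = 16     # Landsat 8 revisit interval

# Minutes in one repeat cycle divided by minutes per orbit
orbits_per_cycle = REPEAT_CYCLE_DAYS * 24 * 60 / ORBIT_PERIOD_MIN  # ≈ 232.7
```

Each of those ~233 ground tracks corresponds to a "path" in the Worldwide Reference System grid used to index Landsat scenes.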