
Welcome to the Kellylab blog

geospatial matters


Entries in disaster response (49)

Thursday
Sep 20, 2012

Cool cartography: Risk mapping at a broad view

I came across this short blurb by John Nelson on some tricks for catchy large-scale maps. The bullet points include:

  • Interesting Topic.  The subjects of these maps inherently represent risk, which we want to understand.
  • Unexpected Scope.  A forest view of something that’s usually seen at the tree-level offers satisfying perspective.
  • Big and Clear.  A single dataset is conceptually simple, and when large enough, it provides its own context, promoting conversation in the wild.
  • Sharable.  A static image is portable and paste-able, easily nestling into articles, blogs, tweets, and PowerPoints.
  • Attractive.  The currency of design buys a second or third look.

There is often a push to make large datasets available through interactive webGIS portals, but I think this makes a good case that there is still a role for skilled cartography in presenting information in captivating ways.

Below is an example of one of Nelson's maps, and more can be found here.

Thursday
Jun 21, 2012

What is "success" with post-disaster crowdsourcing?

At a recent workshop I gave on webGIS, after an overview of some recent uses of crowdsourced data and VGI in disasters (fire in San Diego, earthquake in Christchurch, Ushahidi everywhere...), I was asked about the success of these projects. Who used the data? How? (And who funded these websites? But that is another story.) I had only the vaguest of answers. Here is a thoughtful critique on this subject by Paul Currion on MobileActive.org, examining the use of the Ushahidi project in Haiti. Paul is an aid worker who has spent the last 10 years working on the use of ICTs in large-scale emergencies. He asks whether crowdsourcing adds significant value to responding to humanitarian emergencies, arguing that merely increasing the quantity of information in the wake of a large-scale emergency may be counterproductive. Why? Because aid workers need clear answers, not a fire-hose of information. Information from the crowd needs to be curated, organized, and targeted for response. He makes the point that since crowdsourced data will have to be sorted through, can be biased, and can be ephemeral, aid agencies will have to carry out exactly the same needs assessments they would have done without the crowdsourced information.

Where and when do crowdsourced data add value to a situation or project? How can we effectively deal with the bias that naturally comes with such data? We deal with this all the time in my smaller web-related projects, OakMapper and SNAMP for example. What, for example, is the future role of the web in adaptive forest management? How do these new collaborative and extensive tools help us make important decisions about natural resources management in often contentious contexts? More to think about.

Saturday
Mar 24, 2012

ASPRS 2012 Wrap-up

ASPRS 2012, held in Sacramento, California, had about 1,100 participants. I am back to being bullish about our organization, as I now recognize that ASPRS is the only place in the geospatial sciences where members of government, industry, and academia can meet, discuss, and network in a meaningful way. I saw a number of great talks, met with some energetic and informative industry reps, and got to catch up with old friends. Some highlights: Wednesday's keynote speaker was David Thau from Google Earth Engine, whose talk "Terapixels for Everyone" showcased the ways in which the public's awareness of imagery, and their ability to interact with geospatial data, are increasing. He calls this phenomenon (and GEE plays a big role here) "geo-literacy for all", and discussed new technologies for data/imagery acquisition, processing, and dissemination to a broad set of publics that can include policy makers, land managers, and scientists. USGS's Ken Hudnut gave Thursday's keynote, with a sobering message about California earthquakes and the need for (and use of) geospatial intelligence in disaster preparedness.

Berkeley was well represented: Kevin and Brian from the GIF gave a great workshop on open source web, Kevin presented new developments in cal-adapt, Lisa and Iryna presented chapters from their respective dissertations, both relating to wetlands, and our SNAMP lidar session with Sam, Marek, and Feng (with Wenkai and Jacob from UC Merced) was just great!

So, what is in the future for remote sensing/geospatial analysis as told at ASPRS 2012? Here are some highlights:

  • Cloud computing, massive datasets, and data/imagery fusion are everywhere, but principles of basic photogrammetry still come into play;
  • We saw neat examples of scientific visualization, including smooth rendering across scales, fast transformations, and immersive web;
  • Evolving, scalable algorithms for regional or global classification and/or change detection, for real-time results rendering with interactive (on-the-fly) algorithm parameter adjustment, often involving open source tools and machine learning;
  • Geospatial data and analysis are heavily, but inconsistently, deployed throughout the US for disaster response;
  • Landsat 8 goes up in January (party anyone?) and USGS/NASA are looking for other novel partnerships to extend the Landsat lifespan beyond that;
  • Lidar is still big: with new deployable and cheaper sensors like FLASH lidar on the one hand, and increasing point density on the other;
  • OBIA, OBIA, OBIA! We organized a nice series of OBIA talks and saw some great presentations on accuracy, lidar+optical fusion, and object movements; but thorny issues about segmentation accuracy and object ontology remain;
  • Public interaction with imagery and data is critical. The public can be a broader scientific community, or an informed and engaged community who can presumably use these types of data to support public policy engagement, disaster preparedness, and response.

Tuesday
Aug 30, 2011

Fire in the Great Dismal Swamp, VA

A nice example of remote sensing for fire: this visualization allows you to compare the utility of hyperspectral images for seeing through the smoke and mapping fire scars. The article is about a lightning-strike fire in the fantastically named "Great Dismal Swamp" in Virginia. Hurricane Irene might put a damper on the fire.

“Eight inches of rain will not put the fire out,” said Tim Craig, Fire Management Officer for the refuge. “It will buy us time to clear our way through the downed trees back to the fire zone after the storm.” Irene generously drenched the swamp with 10 to 15 inches of rain, but initial assessments show that the fire is still burning. Before the storm, the Lateral West fire was 35 percent contained. Smoke still rose from at least 30 acres after the storm, though open flames were no longer visible and the fire did not spread under Irene’s strong winds, said local news reports. The sudden flush of rain left puddles that are still soaking into the soil and may yet help extinguish the fire.

See the interactive tool and article here.

Saturday
Aug 27, 2011

Interactive map of Irene: NYTimes

Here is the NYTimes live map showing Irene's progress.

Looks like it went directly over Beaufort and the NOAA Marine Lab. Hope everyone is ok.

Irene as of 8:30 PST

The NYTimes site also has some great tools showing current wind speed.

Friday
Aug 26, 2011

New York hurricane evacuation map

There are three evacuation zones in New York City that are based on the strength of the hurricane making landfall. Mayor Bloomberg has issued a mandatory evacuation of Zone A, plus areas of the Rockaways that are in Zone B, by 5 p.m. on Saturday. From the NYTimes.

Tuesday
Aug 23, 2011

Here's an earthquake map you don't see every day...

A 5.3 in Colorado, and a 5.8 in Virginia! The earlier Colorado temblor (8-23-11) was outside Trinidad, CO, and today's Virginia quake (8-23-11) was east of Charlottesville. Info from the USGS Earthquake site. Everyone at "Talk of the Nation" is freaking out on air!

Tuesday
Aug 16, 2011

Photogrammetry in action: dating the great "A trip down Market Street", 1906

Sometime before the 1906 San Francisco earthquake, a camera was attached to a streetcar travelling north along Market Street, San Francisco, recording the hustle and bustle, the multi-modal transportation options, and the wonderful fashions of early 20th century San Francisco. The movie, which I happened to catch last week at SFMOMA as part of their great (but too large) Stein collection, is mesmerizing. Check it out here on YouTube. It is clearly pre-earthquake, but its exact timing has not been known until now.

Ferry Building arrival

In an article in Photogrammetric Engineering and Remote Sensing, Richard Greene narrows the window of acquisition down to between 24 March and 30 March 1906, just weeks before the earthquake on 18 April. Remember, that earthquake and the fires that followed largely destroyed much of the city. He performs this feat of timing through detailed photogrammetry: determining the time of day, the solar position, and the time of year from shadows on cornices and other architectural details.
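
Greene's actual procedure is in the paper, but the underlying geometry is easy to sketch. Below is a minimal Python version of the forward problem, computing sun elevation and azimuth (and thus shadow bearing) for a given instant, using the standard NOAA low-accuracy solar position series; the Market Street coordinates and the Pacific Standard Time offset are my assumptions, not values from the article.

```python
# A minimal sketch of the shadow geometry behind dating the film -- not
# Greene's actual procedure. NOAA low-accuracy solar position approximation.
import math
from datetime import datetime

LAT, LON = 37.79, -122.40   # Market Street, San Francisco (assumed)
TZ = -8                     # Pacific Standard Time offset, hours from UTC (assumed)

def solar_position(when: datetime):
    """Return (elevation, azimuth) in degrees; azimuth clockwise from north."""
    doy = when.timetuple().tm_yday
    hour = when.hour + when.minute / 60
    g = 2 * math.pi / 365 * (doy - 1 + (hour - 12) / 24)   # fractional year (rad)
    # equation of time (minutes) and solar declination (radians), NOAA series
    eqtime = 229.18 * (0.000075 + 0.001868 * math.cos(g) - 0.032077 * math.sin(g)
                       - 0.014615 * math.cos(2 * g) - 0.040849 * math.sin(2 * g))
    decl = (0.006918 - 0.399912 * math.cos(g) + 0.070257 * math.sin(g)
            - 0.006758 * math.cos(2 * g) + 0.000907 * math.sin(2 * g)
            - 0.002697 * math.cos(3 * g) + 0.00148 * math.sin(3 * g))
    # true solar time (minutes) and hour angle (radians)
    tst = hour * 60 + eqtime + 4 * LON - 60 * TZ
    ha = math.radians(tst / 4 - 180)
    lat = math.radians(LAT)
    cos_zen = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(ha))
    zen = math.acos(cos_zen)
    az = math.degrees(math.acos(
        (math.sin(decl) - math.sin(lat) * cos_zen) / (math.cos(lat) * math.sin(zen))))
    if ha > 0:              # afternoon: sun is west of due south
        az = 360 - az
    return 90 - math.degrees(zen), az

# A shadow points directly away from the sun, so shadow bearings measured on
# dated architecture constrain when the footage was shot.
elev, az = solar_position(datetime(1906, 3, 27, 15, 30))
print(f"elevation {elev:.1f} deg, azimuth {az:.1f} deg, "
      f"shadow bearing {(az + 180) % 360:.1f} deg")
```

Running this over candidate dates and times and comparing against shadow angles in the frames is the spirit of the inversion; note that a given declination occurs on two dates a year, so other cues are needed to pick spring over fall.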

Another windy day in the city! These cornices were helpful in determining solar position.

So cool! The article can be found here. Full reference:

Greene, R., 2011. Dating the filming of "A trip down Market Street". Photogrammetric Engineering & Remote Sensing 77, 839-848.

Check out some fun pics from the movie.


Thursday
Jul 7, 2011

A bit late, but the tornado track from Tuscaloosa, AL

NASA has released a unique satellite image tracing the damage of a monster EF-4 tornado that tore through Tuscaloosa, Alabama, on April 27th. It combines visible and infrared data to reveal damage unseen in conventional photographs.

"This is the first time we've used the ASTER instrument to track the wake of a super-outbreak of tornadoes," says NASA meteorologist Gary Jedlovec of the Marshall Space Flight Center in Huntsville, AL.

How would you map it? As a line or as a field?
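
For the "field" answer, here is a hedged sketch of the general idea behind this kind of visible/infrared damage mapping: difference a vegetation index (NDVI) computed before and after the storm, so the scoured track stands out. This is not NASA's actual ASTER recipe; the file names and threshold below are hypothetical.

```python
import rasterio

def ndvi(red, nir):
    """Normalized difference vegetation index from red and near-infrared bands."""
    red = red.astype("float32")
    nir = nir.astype("float32")
    return (nir - red) / (nir + red + 1e-6)   # epsilon avoids divide-by-zero

# Hypothetical pre- and post-tornado scenes, co-registered single-band rasters.
with rasterio.open("pre_red.tif") as r, rasterio.open("pre_nir.tif") as n:
    pre = ndvi(r.read(1), n.read(1))
    profile = r.profile
with rasterio.open("post_red.tif") as r, rasterio.open("post_nir.tif") as n:
    post = ndvi(r.read(1), n.read(1))

delta = post - pre          # vegetation loss shows as a strong NDVI drop
damage = delta < -0.2       # illustrative threshold; tune per scene

profile.update(dtype="float32", count=1)
with rasterio.open("ndvi_change.tif", "w", **profile) as dst:
    dst.write(delta.astype("float32"), 1)
print(f"flagged pixels: {int(damage.sum())}")
```

Vectorizing the damage mask would give the "line" view; the raster difference itself is the "field".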

Another cool image of the tornado track.

Tuesday
Jul 5, 2011

Debris from Japanese tsunami steadily drifting toward California

This item got heavy news rotation this morning: the considerable debris from the tsunami in Japan is out to sea and slowly moving toward Hawaii and the west coast of the US. 

The debris is moving east at roughly 10 miles a day, and is spread over an area about 350 miles wide and 1,300 miles long, roughly the size of California. It should reach beaches and coastal cities in California, Oregon and Washington in 2013 or early 2014. These estimates are from a computer model, the details of which are spotty in the articles I read. Example here from insidebayarea.
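
The reported numbers are easy to sanity-check. A back-of-envelope drift calculation in Python, where the along-current path length is a rough assumption of mine (a straight great-circle run is closer to 5,000 miles, but debris rides the longer Kuroshio and North Pacific Current route):

```python
from datetime import date, timedelta

DRIFT_MILES_PER_DAY = 10   # drift rate reported in the news coverage
PATH_MILES = 6500          # along-current Japan-to-West-Coast path (my assumption)

start = date(2011, 3, 11)  # date of the Tohoku earthquake and tsunami
arrival = start + timedelta(days=PATH_MILES / DRIFT_MILES_PER_DAY)
print(arrival)             # 2012-12-20: the leading edge plausibly lands in 2013
```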

Debris movement simulation: purple is low density, red is high density of debris

There is considerable concern about this. Last Monday, representatives from the Coast Guard, NOAA, the Environmental Protection Agency, the U.S. State Department and other agencies met for the first time in Honolulu to share information about the Japanese debris and begin to chart a strategy.

Among their plans: to notify the U.S. Navy and commercial shipping companies that regularly sail across the Pacific so they can begin to document what is floating. That could lead to expeditions to map and study it.

Curtis Ebbesmeyer, a Seattle oceanographer who has studied marine debris for more than 20 years (and done some neat work with rubber duckies to map ocean currents), is one of the leads interviewed for the report.