Planet Geospatial

GeoServer Blog: GeoServer Sunday Sprint (FOSS4G)

Back in March we had a Save the Date post for FOSS4G 2014 in Portland. Check that your travel plans include participating in the Code Sprint at the end of the conference.

The code sprint is an opportunity for the team to get together and work on “tough problems without funding”. For GeoServer we have two candidates:

  1. Update to a recent version of Wicket to improve browser compatibility
  2. Update CITE Conformance Tests

GeoServer is extending the code sprint to include:

  • Saturday, September 13th: FOSS4G is providing facilities at WhereCampPDX.
  • Sunday, September 14th: Boundless is arranging facilities at nedspace from 10am-4pm.

To attend, add your name to the OSGeo wiki page, and we will look forward to seeing you in Portland!

NedSpace, Suite 250, SW 11th Avenue, Portland

Thanks to Mike Pumphrey for arranging the venue for the Sunday Sprint.

GIS Lounge: National Transportation Atlas Database

The Bureau of Transportation Statistics recently released the 2014 National Transportation Atlas Database. From the BTS: the DVD is a set of nationwide geographic databases of transportation facilities, transportation networks, and associated infrastructure. These datasets include spatial information for transportation modal networks and intermodal terminals, as well as the related [...]

The post National Transportation Atlas Database appeared first on GIS Lounge.

GIS Lounge: Evaluating Ecosystems from Space

The impact of humans upon the planet’s ecosystems has long been a concern for scientists, and now the European Space Agency (ESA) is taking their assessment of natural resources to a new level with satellite technology. The emphasis of a new project from the ESA is on promoting sustainability and [...]

The post Evaluating Ecosystems from Space appeared first on GIS Lounge.

All Points Blog: Update 2: Esri Announces MOOC: Going Places with Spatial Analysis

More info via various sources. Suggestion to Esri: it'd sure be nice if all of this information were on the MOOC home page. Start date: Sept 3 (via Learn ArcGIS). MOOC guest lecturers (via @esrimooc): David Gadsden - Esri Nonprofit Program coordinator. Geographer. Humanitarian.... Continue reading

thinkwhere: North bar stone uncovered

Via Penny Goodman’s tweet about a Secret Leeds Forum post on the North Bar Stone. Here’s a photo of the Beating the Bounds walk around the stone for Terminalia Festival 2014, and a photo of the stone today.


AnyGeo: New from Google – Maps Coordinate + Maps Engine Pro

Google has this week announced that Google Coordinate will now be included with every Maps Engine Pro subscription ($5/user/month). Recall that Google Maps Coordinate is the mobile and web app that lets teams assign jobs and share their locations with each … Continue reading

LiDAR News: Oregon Cities Partner on LiDAR Data Collection

DOGAMI uses Quantum Spatial, a geo-mapping company with a Portland office, to gather the data. Continue reading →


Directions Magazine: URISA Board of Directors Election Results Announced

All Points Blog: GIS Health News Weekly: Animal and People Pathogens

Map Pathogens in Man and Animal for Prevention. Researchers at the University of Liverpool's Institute of Infection and Global Health are building the world's most comprehensive database describing human and animal pathogens. Called the Enhanced Infectious Diseases (EID2) database, it... Continue reading

GeoServer Blog: GeoServer 2.6-beta Released

The GeoServer team is overjoyed to announce the release of GeoServer 2.6-beta.

I hope you are enjoying the new website – the download page for 2.6-beta provides links to the expected zip, war, dmg and exe bundles. For this release we are experimenting with providing source downloads directly from the GitHub 2.6-beta tag.

As a development release, 2.6-beta is considered experimental and is provided for testing purposes. This release is not recommended for production (even if you are excited by the new features).

This release is made in conjunction with GeoTools 12-beta. Thanks to Kevin for making a beta release of GeoWebCache 1.6.0-beta with relatively little notice.

What to expect … and what we expect from you

A complete change log is available from the issue tracker. We will ask you to wait for 2.6.0 before we let Andrea write a pretty blog with pictures illustrating what features have been added. Instead, 2.6-beta is my chance to ask you to download GeoServer 2.6-beta for testing.

Testing is a key part of the open source social contract. The GeoServer team has identified a few areas where we would like to ask for help. This is your best chance to identify issues early, while we still have time to do something about them. For those making use of commercial support, ask your vendor about their plans for 2.6-beta testing. We would like to ensure the functionality you depend on is ready to go for a Q2 release.

When testing GeoServer 2.6-beta please let us know on the user list (or #GeoServer) how it works for you. We will be sure to thank you in the final release announcement and product presentations.

Java 7 Testing

With Oracle retiring Java 6 security updates, the time has come to raise the minimum bar to Java 7.

We know a lot of downstream projects (such as OSGeo Live) have been waiting for GeoServer to support Java 7. Thanks to CSIRO, Boundless, and GeoSolutions for providing Java 7 build environments, allowing us to make this transition in a responsible fashion.

Testing:

  • This is a major testing priority on all platforms.
  • Windows 7: The start.bat used by the “run manually” install has trouble running as an administrator. We recommend installing as a service for this release (GEOS-5687)
  • Mac: You will need to install Oracle Java 7 (as OpenJDK 7 is not yet available for OSX). We have not yet figured out how to run GeoServer.App with Java 7 (GEOS-6588) and are open to suggestions.

References:

WFS Cascade

This is a really exciting change, swapping out our gt-wfs client code for a new gt-wfs-ng implementation with a new GML parser / encoder. After comparing the quality of the two implementations we decided to go all in with this transition, and thus would really like your help testing.

We would like to hear back on cascading the following configurations:

  • GeoServer
  • deegree
  • MapServer
  • tinyows - there is a critical fix about axis order in tinyows trunk. It corrects (finally!) the output … but perhaps not yet the filters?
  • ArcGIS
  • Other – any other WFS you are working with!

Testing:

  • Pay special attention to the flags used for axis order. There are different flags to account for each way a WFS implementation can get confused. You will find some implementations expect the wrong axis order on request, but are capable of producing the correct axis order output (a quick way to eyeball the coordinate order is sketched below).
  • We especially ask our friends in Europe to test WFS services published for INSPIRE compliance.
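For a quick scripted smoke test of a cascaded endpoint, something along these lines can help. This is a minimal sketch assuming the owslib Python package is installed; the URL and layer name (the familiar topp:states demo layer) are placeholders for your own setup:

from owslib.wfs import WebFeatureService

# Point this at the GeoServer doing the cascading (placeholder URL)
wfs = WebFeatureService('http://localhost:8080/geoserver/wfs', version='1.1.0')
print(list(wfs.contents))   # layer names visible through the cascade

# Fetch a single feature and eyeball the coordinate order in the returned GML
resp = wfs.getfeature(typename='topp:states', maxfeatures=1)
print(resp.read()[:2000])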

This was an epic amount of work by Niels and we have a couple of new features waiting in the wings based on the success of this transition.

Curves support for GML and WMS

A large amount of work has been put into extending the Geometry implementation used by GeoServer.

We have experimented with several approaches over the years (including ISO 19107 and a code sprint with the deegree project) and it is great to finally have a solution. As a long-time user of the JTS Topology Suite we have been limited to a point, line and polygon model of geometry. Andrea has very carefully extended these base classes to allow for both GML output and rendering. The trick is using a tolerance to convert the arcs and circles into line work for geometry processing.
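To make the tolerance idea concrete, here is a small illustration of my own (not GeoServer’s actual code): choose the angular step so that each chord deviates from the true arc by no more than the tolerance, then sample the arc at that step.

import math

def linearize_arc(cx, cy, r, a0, a1, tol):
    """Approximate a circular arc by a polyline whose maximum
    deviation (sagitta) from the true arc stays within tol (tol > 0)."""
    # largest angular step whose chord stays within tol of the arc
    step = 2 * math.acos(max(1.0 - tol / r, -1.0))
    n = max(1, int(math.ceil((a1 - a0) / step)))
    return [(cx + r * math.cos(a0 + (a1 - a0) * i / n),
             cy + r * math.sin(a0 + (a1 - a0) * i / n))
            for i in range(n + 1)]

# A quarter circle of radius 10: a coarse tolerance gives few vertices,
# a fine tolerance gives many.
print(len(linearize_arc(0, 0, 10, 0, math.pi / 2, tol=1.0)))    # -> 3
print(len(linearize_arc(0, 0, 10, 0, math.pi / 2, tol=0.01)))   # -> 19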

Testing for the 2.6-beta release is limited to those with Oracle Spatial. If you are interested in funding/volunteering support for PostGIS please contact the geoserver-devel email list.

Testing:

  • Look for “Linearization tolerance” when configuring your layer.

Advanced projection handling for raster

We would like to hear feedback on how maps that cross the date line (or are in a polar projection) have improved for you.

Testing:

  • No special settings needed

Reference:

Coverage Views

We struggled a bit with how to name this great new feature; however, if you work with raster data, this is your chance to recombine bands from different sources into a multi-band coverage.

Testing:

  • Use “Configure new Coverage view” when creating a new layer

References:

Startup Testing

Yes, this is an ominous item to ask you to test.

GeoServer 2.6 simplifies where configuration files are stored on disk. Previous versions were willing to spread configuration files between the webapps folder, the data directory and any additional directories on request. For GeoServer 2.6 configuration files are limited to the data directory as a step towards improving clustering support and growing our JDBC Config story.

Testing:

  • No special settings needed
  • Special request to check files that are edited by hand on disk (such as security settings and FreeMarker templates)

References:

Pluggable Styles

For everyone happy with the CSS Style Extension we would like to ask you to test a change to the style edit page (allowing you to create a CSS or SLD style from the start).

Testing:

  • Install CSS Extension and look for a new option when creating a style

Reference:

Wind barbs and WKT Graphics

I am really happy to see this popular extension folded into the main GeoServer application.

Testing:

  • Check GeoTools WKT Marks for examples you can use in your SLD file (these marks are referenced via a wkt:// prefix in WellKnownName)

References:

New Formats and Functionality

We have new implementations of a couple of modules:

  • Printing – new implementation from our friends at MapFish
  • Scripting – includes a UI for editing scripts from the Web Administration Application

A final shout out to ask for help testing new formats:

  • NetCDF
  • GRIB
  • OGR

About GeoServer 2.6

Articles and resources for GeoServer 2.6 series:

Between the Poles: Commercial-scale carbon capture and sequestration project begins construction

About 37% of U.S. electric power generation is from coal-fired power plants. New EPA regulations scheduled to come into effect July 1, 2015 will restrict CO2 emissions from power plants. Coal-fired power plants will have to implement some form of carbon capture and sequestration (CCS), convert to natural gas or some other cleaner fuel, or shut down. CCS takes two forms, pre-combustion or post-combustion. I blogged about a pre-combustion commercial CCS implementation in Kentucky that began operation earlier this year.

Construction of the first U.S. commercial-scale post-combustion carbon capture and sequestration (CCS) retrofit has begun. Post-combustion CCS technology will be installed at the coal-fired 240 MW Parish Generating Station in Houston, Texas.

The Petra Nova project will capture 90% of the plant’s CO2 emissions through an advanced amine-based process, which was piloted in a three-year project in Alabama. The CO2 capture rate will result in lower greenhouse gas emissions than from a traditional natural gas-fired power plant. The process involves scrubbing the flue gases with an amine solution to form an amine–CO2 complex, which is then decomposed by heat to release high-purity CO2. The regenerated amine is recycled to be reused in the capture process. The CO2 capture and compression system will be powered by a cogeneration plant comprising a combustion turbine and heat recovery boiler. The oil field will be monitored to verify that the CO2 remains underground.

The captured CO2 will be compressed and transported via an 80-mile pipeline to increase oil output from an oil field with declining production. After separation from the oil, the CO2 will be injected underground for permanent sequestration.

LiDAR News: LiDAR Used to Quantify Carbon Stored in Hedgerows

Secondly, it also quantifies one of the key ecosystem services of hedgerows in taking up carbon dioxide and storing it as biomass. Continue reading →


geoMusings: Lock-In

I’ve been a consultant/programmer/integrator/other for over twenty years now. That’s not quite long enough to say I’ve seen it all but long enough to notice a few patterns. Admittedly, I’ve spent the vast majority of that time working in the defense world so the patterns may be heavily skewed to that but I think not.

I’ve run across a number of well-entrenched government-developed systems, such as command-and-control systems, with user interfaces and experiences that many professional designers would consider abhorrent. Yet, I have watched smart, motivated men and women in uniform stand in front of working groups and committees dedicated to “improving” systems and workflows and advocate passionately for these seemingly clunky systems.

Why? Because they know how to use these systems inside and out to meet their missions. User experience is ultimately about comfort and confidence. A user that is comfortable with a system will have a great experience with it regardless of its appearance. DOD tackles this reality through training. For all its faults, there is still no organization better at developing procedures and thoroughly training its people in them. It results in a passionate loyalty for the tools that help them do their jobs and places a very high hurdle in front of any tools that seek to replace current ones.

This experience has given me a different view of the concept of “lock-in.” Over my career, I have heard this term used in a pejorative sense, usually preceded by the word “vendor.” Although I have used the term myself, I usually hear it levelled by a vendor’s competitors. It is typically meant to refer to practices a vendor uses to establish barriers to exit for its customers, making it harder for them to choose a competing technology. Such practices can include artificial bundling of unrelated tools, license trickery, half-truths in marketing, and many more, all of which do happen.

Lock-in is a real thing. Lock-in can also be a responsible thing. The organizations I have worked with that make the most effective use of their technology choices are the ones that jump in with both feet and never look back. They develop workflows around their systems; they develop customizations and automation tools to streamline repetitive tasks and embed these in their technology platforms; they send their staff to beginning and advanced training from the vendor; and they document their custom tools well and train their staff on them as well. In short, they lock themselves in.

This is the right and responsible thing to do. An organization, once it has selected a technology, has a responsibility to master it and use it as effectively as it can. If you start applying numbers to all the activities listed above, you will quickly see that it is an investment that far outstrips the original investment in the technology itself. In fact, the cost of the technology itself is often seen as marginal to the overall lifecycle cost, which makes arguments about removing licensing costs, for example, less effective than they would appear to be.

This is true regardless of the provenance of the technology. The original technology has to start to become a hindrance before change is seriously considered, which I am seeing in a few cases these days. But, by and large, the very strong pattern I have seen is that the majority of lock-in originates with users. To fail to recognize that and continue to target the vendor is to miss the point and, ultimately, the target.

Directions Magazine: OGC calls for comment on candidate Moving Feature Encoding standard

Directions Magazine: TomTom Telematics Tops 400,000 Vehicles Subscribed to Its Software as a Service

Directions Magazine: Former Super Bowl Champion Running for Congress to Keynote MAPPS Summer Conference

Directions Magazine: HERE is the Official Map of Red Bull

All Points Blog: TomTom to Offer Faster Updates of Map Database

TomTom's Multinet-R platform promises to deliver faster updates to clients using a more narrowly constrained quality assurance process. The QA process will utilize faster validation leveraging crowd-sourced data. The objective is to get new data into the hands of clients with updates... Continue reading

Directions Magazine: Call for Chapters: STEM and GIS in Higher Education

thinkwhere: Leeds Creative Labs – Initial steps and ideas around The Hajj

Cross posted from The Leeds Creative Labs blog.

I signed up to take part in the Leeds Creative Labs Summer 2014 programme with the hope that it would result in something interesting, something that a techie would never normally get the opportunity to do. It’s certainly exceeded that expectation – it’s been a fascinating, enthralling process so far, and I feel honoured to have been selected to participate.

 

I’m the designated “technologist”, in partnership with Dr Seán McLoughlin and Jo Merrygold on this project around The Hajj and British Muslims. Usually I tend to do geospatial, collaborative and open data projects, although I’m also a member of the Leeds group of psychogeographers. Psychogeography is intentionally vague to define, but one definition is that it’s about the feelings and effects of space and place on people. It’s also about a critique of space – a way to see how modern-day consumerism/capitalism is changing how our spaces are, and by extension how we behave in these spaces.

We had our first meeting last week – it was a “show and tell” by Seán and Jo to share some of the ideas, research, themes and topics that could be of relevance to what we will be doing.

Show and tell

Seán, from the School of Philosophy, Religion and The History of Science, introduced his research on Islam and Muslim culture, politics and society in contexts of contemporary migration, diaspora and transnationalism. In particular his work has been around and with South Asian heritage British Muslim communities. The current focus of his work, and the primary subject of this project, is researching British Muslim pilgrims’ experiences of the Hajj.

The main resources are audio interviews, transcripts and on-line questionnaires from a number of different sources: pilgrims of all ages and backgrounds, and other people related to the Hajj “industry”, such as tour operators and charities.

Towards the end of the year there are a few set days for the Hajj – a once-in-a-lifetime pilgrimage to the holy Saudi Arabian city of Mecca. You have probably seen photos such as this one, where thousands of pilgrims circle the Kaaba – the sacred cuboid house right in the centre of the most sacred Muslim mosque.

It’s literally the most sacred point in Islam. It’s the focal point for prayers and thoughts. Muslims orient themselves towards this building when praying. The place is thought about everywhere – for example, people may have paintings with this building in their homes in the UK, and they may bring back souvenirs of their Hajj pilgrimage. You can see how the psychogeography of space and place – its effects on the emotions and thoughts of people – could be very applicable here!

And yet the Hajj itself is more than just about the Kaaba – it’s a number of activities around the area. Here’s a map!

The Hajj

These activities, each with their own days and particular ways of doing them, are literally meant to be in the footsteps of key religious figures of the past. I will let the interested reader discover them for themselves, but there are a number of fascinating issues surrounding the Hajj for British Muslims which Seán outlined.

Here’s a small example of some of these themes:

  • Organising the Hajj (tour operators, travel etc).
  • What the personal experiences of the pilgrims were.
  • How Mecca has changed, and how the Hajj has changed.
  • The commercial, the profane, the everyday, and the transcendent and the sacred.
  • How this particular location and event works over time and space.
  • The differences and similarities of people and cultures, and possible experiences of poverty.
  • “Hajj is not a holiday” and Hajj Ratings.
  • Differences in approach of modern British Muslims to going on the Hajj (compared to, say, their grandparents).
  • Returning home and the meaning and expectations of returnees (called Hajjis).
What we did and didn’t do

We didn’t rush to define our project outputs – but we all agreed that we wanted to produce something!

Echoing Maria’s post earlier, we are trying to leave the options open for what we hope to do, allowing our imaginations to run and to explore options. I think this does justice to the concept of experimentation and collaboration, and should help us be more creative. We can see which ideas spark our imaginations and which address the issues better – what examples and existing things are out there that can be re-appropriated or borrowed, and which things point us in the right direction.

What I did after

So after the show and tell my mind was spinning with new ideas and concepts. It took me a few days to go over the material and do some research of my own, and see what sorts of things I might be able to contribute to. It’s certainly sparked my curiosity!

I was to prepare a show and tell (an ideas brain-dump) for the next meeting. The examples I prepared ranged from cut-and-paste transcriptions, 3D maps, FourSquare and social media, to story maps, interactive audio presentations and oral history applications. I also gave a few indications as to possible uses of psychogeography with the themes. I hope to use this blog to share some of these ideas in later posts.

Initially I mentioned the difference between a “hacker” approach and the straight client-and-consultant way of doing development – for example, encouraging collaborative play and exploration rather than hands-off development, and allowing things to remain open. The further steps would be crystallizing some of these ideas – finding better examples and working out what we want to look at or devote more time to. We’d then be able to focus on some aims and requirements for a creative, interesting project.


All Points Blog: GIS Education News Weekly: Geotech in PhysEd, TSA Geographic Literacy, Map Projections

Geospatial in Phys Ed? Danielle Grant, a Potsdam, NY elementary school physical education teacher has been named the 2014 New York State Physical Education Teacher of the Year. She's high tech: Grant uses pedometers, pulse sticks, GPS units, gaming systems, iPads, PowerPoint, the... Continue reading

All Points Blog: MOOC from Open Online Academy Starts this Fall: Introduction to GIS and Mapping

Introduction to GIS and Mapping starts this fall from the Open Online Academy. The course page suggests it's either four weeks or eight; it's not clear. Until recently cartography skills were hard to learn. Today, new technologies allow us to download geographical data, use... Continue reading

Geoinformatics Tutorial: Reading and Map projecting raster data with geolocation arrays using gdal

The following describes how to geocode a raster band whose geolocation information is stored in one channel containing latitude values for each pixel and another channel containing longitude values for each pixel. Opening the image, you see the unprocessed orbital swath.


Information on the hdf image can be retrieved with gdalinfo:

gdalinfo C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5

The metadata printed out contains the names of the subchannels. Information about a specific channel can be retrieved by using gdalinfo on exactly this subchannel name from the metadata:

gdalinfo HDF5:"C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5"://Brightness_Temperature_(89.0GHz-A,V)

The tricky part now is to transfer such raster bands into a map projection.

gdalwarp can read separate latitude and longitude rasters, a mechanism gdal calls geolocation arrays. The comment on this page brought me a step further, although I think the vrt code and the commands given there are not completely correct. One has to create a vrt file describing the image raster that also refers to the latitude and longitude bands, and then in addition a vrt file describing the latitude band and a vrt file describing the longitude band.

So I create a vrt-file (see here for info) named GW1AM2_201301311114_050A_L1SGBTBR_1110110.vrt:



<VRTDataset rasterXSize="486" rasterYSize="2038"> <!-- set rasterXSize/rasterYSize and dataType to the values gdalinfo reports for this band -->
  <Metadata domain="GEOLOCATION">
    <MDI key="X_DATASET">C:\Users\max\Documents\lon.vrt</MDI>
    <MDI key="X_BAND">1</MDI>
    <MDI key="Y_DATASET">C:\Users\max\Documents\lat.vrt</MDI>
    <MDI key="Y_BAND">1</MDI>
    <MDI key="PIXEL_OFFSET">0</MDI>
    <MDI key="LINE_OFFSET">0</MDI>
    <MDI key="PIXEL_STEP">1</MDI>
    <MDI key="LINE_STEP">1</MDI>
  </Metadata>
  <VRTRasterBand dataType="UInt16" band="1">
    <SimpleSource>
      <SourceFilename relativeToVRT="0">HDF5:GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5://Brightness_Temperature_(89.0GHz-A,H)</SourceFilename>
      <SourceBand>1</SourceBand>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>

Then a file named lon.vrt:





<VRTDataset rasterXSize="486" rasterYSize="2038"> <!-- same size as the image band -->
  <VRTRasterBand dataType="Float32" band="1">
    <SimpleSource>
      <SourceFilename relativeToVRT="0">HDF5:GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5://Longitude_of_Observation_Point_for_89A</SourceFilename>
      <SourceBand>1</SourceBand>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>

Finally a file named lat.vrt:




<VRTDataset rasterXSize="486" rasterYSize="2038"> <!-- same size as the image band -->
  <VRTRasterBand dataType="Float32" band="1">
    <SimpleSource>
      <SourceFilename relativeToVRT="0">HDF5:GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5://Latitude_of_Observation_Point_for_89A</SourceFilename>
      <SourceBand>1</SourceBand>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>

With these files created and placed in the same directory, I can use gdalwarp, but note that you are calling the vrt file rather than the hdf file!
gdalwarp -geoloc -t_srs EPSG:4326 C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.vrt C:\Users\max\Documents\test100.tif


The result is the following swath:


In order to get it onto the polar stereographic projection, in my case the NSIDC EPSG:3411 projection, I first subset the image:


gdal_translate -projwin -94 90 40 35 C:\Users\max\Documents\test100.tif C:\Users\max\Documents\test101.tif


which results in this image:



Now I reproject into EPSG:3411:

gdalwarp -s_srs EPSG:4326 -t_srs EPSG:3411 C:\Users\max\Documents\test101.tif C:\Users\max\Documents\test102.tif

resulting in this image:
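If you need to repeat this for many granules, the three commands above can be chained in a small script. A minimal sketch, reusing the exact commands and paths from this post:

import os

src = r'C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.vrt'
geocoded = r'C:\Users\max\Documents\test100.tif'
subset = r'C:\Users\max\Documents\test101.tif'
polar = r'C:\Users\max\Documents\test102.tif'

# 1) geocode the swath via its geolocation arrays
os.system('gdalwarp -geoloc -t_srs EPSG:4326 ' + src + ' ' + geocoded)
# 2) subset to the area of interest
os.system('gdal_translate -projwin -94 90 40 35 ' + geocoded + ' ' + subset)
# 3) reproject to the NSIDC polar stereographic projection
os.system('gdalwarp -s_srs EPSG:4326 -t_srs EPSG:3411 ' + subset + ' ' + polar)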

Geoinformatics Tutorial: Reading AMSR-2 Data into GeoTIFF raster using gdal_grid

The passive microwave data from the Advanced Microwave Scanning Radiometer (AMSR), available at https://gcom-w1.jaxa.jp/auth.html, is delivered in orbital swaths. The raster bands contain the image data in various frequency ranges, and the geolocation information for 89GHz is stored in one channel containing latitude values for each pixel and another channel containing longitude values for each pixel. This data needs processing before you have a regular geocoded raster.

The following is not an exhaustive description, but extended notes on how to read AMSR-2 files, which may be of help to others trying to solve a similar task.

For some, the final commented script at our github page may be enough help; the text below tries to explain the most important steps in this script.

The various frequency bands of AMSR-2 have different resolutions (see the product specs and the user manual); we choose the L1R dataset, where the data is already processed to match the geolocation stored in the lat/long bands for 89GHz.

Information on the hdf image can be retrieved with gdalinfo:

gdalinfo C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5

The metadata printed out contains the names of the subchannels. Information about a specific channel can be retrieved by using gdalinfo on exactly this subchannel name from the metadata:

gdalinfo HDF5:"C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5"://Brightness_Temperature_(89.0GHz-A,V)

Opening one of the bands in QGIS, the data looks like this (all images in this post are © JAXA EORC ):


In this example we want to convert the data into the NSIDC sea ice raster (EPSG:3411) covering the Arctic areas, using the gdal_grid utility, which creates a regular grid out of a point collection.

Important to know: the 89GHz channel is divided into 89A and 89B, and only both together give the full resolution of about 5 km. Each 89GHz channel has an associated latitude and longitude raster giving the coordinates for each pixel. For the L1R product, the geolocation information of the other, lower-resolution frequencies can be derived from the 89A longitude and latitude rasters by using only their odd columns, as sketched below.
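A quick sketch of that odd-column relation, using the array names from the script further below (my own illustration, under the assumption that a low-resolution band has half as many columns as the 89A geolocation arrays):

#Column j of a low-resolution band corresponds to column 2*j+1 of the
#89A geolocation arrays, i.e. their odd columns:
lat_lowres = HDF_Lat89A_array[:, 1::2]
lon_lowres = HDF_Lon89A_array[:, 1::2]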

In the first step, I unzip the gz zipfiles, then open the hdf file (*.h5). The various frequency bands are stored in subdatasets, so you open the hdf file with gdal.Open(), and then use gdal.Open() again for a subdataset (for information on the bands you can run gdalinfo on the *.h5 files):


from osgeo import gdal

HDFfile = gdal.Open( r'C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5' )

#List of (name, description) tuples, one per subdataset
HDF_bands = HDFfile.GetSubDatasets()

#HDF subdatasets are opened just as files are opened:
HDF_SubDataset = gdal.Open(HDF_bands[channel][0])
HDF_SubDataset_array = HDF_SubDataset.ReadAsArray()

#Subdataset indices 46 and 48 are the 89A latitude and longitude bands in this product
HDF_Lat89A = gdal.Open(HDF_bands[46][0])
HDF_Lat89A_array = HDF_Lat89A.ReadAsArray()

HDF_Lon89A = gdal.Open(HDF_bands[48][0])
HDF_Lon89A_array = HDF_Lon89A.ReadAsArray()


In the next step, I create a comma-separated file containing longitude, latitude and brightness values for each raster point. This comma-separated file is then the input for gdal_grid. I loop through each pixel and write the three values (longitude, latitude, brightness) into a csv file.

So for the 89GHz channel, I write both 89A and 89B to a csv-file:
 
import pyproj

#Projection objects only need to be created once, outside the loop
wgs84 = pyproj.Proj("+init=EPSG:4326")
EPSG3411 = pyproj.Proj("+init=EPSG:3411")

#Add header line to textfile
textfile = open( AMSRcsv, 'w')
textfile.write('lon,lat,brightness\n')

## Loop through each pixel and write lon/lat/brightness to csv file
## (the 89B and brightness arrays are read with ReadAsArray() just like 89A above)
for i in range(rows):
    for j in range(cols):
        lonA = HDF_Lon89A_array[i,j]
        latA = HDF_Lat89A_array[i,j]

        # lon/lat written to file already projected to EPSG:3411
        (lonA_3411, latA_3411) = pyproj.transform(wgs84, EPSG3411, lonA, latA)
        brightnessA = HDF_Br89AH_array[i,j] * 0.01 #APPLYING SCALING FACTOR!

        lonB = HDF_Lon89B_array[i,j]
        latB = HDF_Lat89B_array[i,j]

        (lonB_3411, latB_3411) = pyproj.transform(wgs84, EPSG3411, lonB, latB)
        brightnessB = HDF_Br89BH_array[i,j] * 0.01 #APPLYING SCALING FACTOR!

        #Keep only points within the latitudes covered by the NSIDC raster
        if 35 < latA < 90:
            textfile.write(str(lonA_3411) + ',' + str(latA_3411) + ',' + str(brightnessA) + '\n')

        if 35 < latB < 90:
            textfile.write(str(lonB_3411) + ',' + str(latB_3411) + ',' + str(brightnessB) + '\n')

textfile.close()
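As an aside on speed: pyproj.transform also accepts whole numpy arrays, so the same csv (here just the 89A half) could be written without the per-pixel loop. This is a sketch of my own, not part of the original script, assuming the arrays defined above:

import numpy as np

lon = HDF_Lon89A_array.ravel()
lat = HDF_Lat89A_array.ravel()
bright = HDF_Br89AH_array.ravel() * 0.01   #APPLYING SCALING FACTOR!

#Project all points in one call and keep only those between 35N and 90N
x, y = pyproj.transform(wgs84, EPSG3411, lon, lat)
keep = (lat > 35) & (lat < 90)
np.savetxt(AMSRcsv, np.column_stack([x[keep], y[keep], bright[keep]]),
           delimiter=',', header='lon,lat,brightness', comments='')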


For the lower resolution channels, I use the odd columns of the 89A longitude and latitude arrays:


 
#Add header line to textfile
textfile = open( AMSRcsv, 'w')
textfile.write('lon,lat,brightness\n')

## Loop through each pixel and write lon/lat/brightness to csv file
for i in range(rows):
    for j in range(cols):
        #For low resolution the odd columns of the Lon/Lat89A arrays are taken!
        lonA = HDF_Lon89A_array[i, j*2+1]
        latA = HDF_Lat89A_array[i, j*2+1]

        # lon/lat written to file already projected to EPSG:3411
        (lonA_3411, latA_3411) = pyproj.transform(wgs84, EPSG3411, lonA, latA)
        brightnessA = HDF_SubDataset_array[i,j] * 0.01 #APPLYING SCALING FACTOR!

        if 35 < latA < 90:
            textfile.write(str(lonA_3411) + ',' + str(latA_3411) + ',' + str(brightnessA) + '\n')

textfile.close()

Now I can almost run gdal_grid, but as described at http://www.gdal.org/gdal_grid.html I need to create an xml file describing my comma-separated csv file.
 
<OGRVRTDataSource>
<OGRVRTLayer name="GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H">
<SrcDataSource>G:\AMSR\GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.csv</SrcDataSource>
<GeometryType>wkbPoint</GeometryType>
<GeometryField encoding="PointFromColumns" x="lon" y="lat" z="brightness" />
</OGRVRTLayer>
</OGRVRTDataSource>

This xml file above can be created in a python script in the following manner (some more info here):
import xml.etree.ElementTree as ET

root = ET.Element("OGRVRTDataSource")

OGRVRTLayer  = ET.SubElement(root, "OGRVRTLayer")
OGRVRTLayer.set("name", AMSRcsv_shortname)

SrcDataSource = ET.SubElement(OGRVRTLayer, "SrcDataSource")
SrcDataSource.text = AMSRcsv

GeometryType = ET.SubElement(OGRVRTLayer, "GeometryType")
GeometryType.text = "wkbPoint"

GeometryField = ET.SubElement(OGRVRTLayer,"GeometryField")
GeometryField.set("encoding", "PointFromColumns")

GeometryField.set("x", "lon")
GeometryField.set("y", "lat")
GeometryField.set("z", "brightness")

tree = ET.ElementTree(root)
tree.write(AMSRcsv_vrt)


Now we can finally run gdal_grid, either from the command line:

gdal_grid -a_srs EPSG:3411 -a average:radius1=4000:radius2=4000:min_points=1 -txe -3850000 3750000 -tye -5350000 5850000 -outsize 760 1120 -l GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.vrt GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.tif

or called from a Python script:

import os

AMSRcsv_shortname = 'GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H'
AMSRcsv_vrt = 'GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.vrt'
AMSR_tif = 'GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.tif'

radius1 = 4000
radius2 = 4000
os.system('gdal_grid -a_srs EPSG:3411 -a average:radius1=' + str(radius1) + ':radius2=' + str(radius2) + ':min_points=1 -txe -3850000 3750000 -tye -5350000 5850000 -outsize 760 1120 -l ' + AMSRcsv_shortname + ' ' + AMSRcsv_vrt + ' ' + AMSR_tif)

The radius indicates the distance around a given output raster point within which the algorithm searches for points falling into the NSIDC raster – if too small, it will result in empty pixels; if too large, there will be too much smoothing, since many pixels are averaged into one. (With the -txe/-tye extents and -outsize used above, the output cell size is 10 km, so a 4000 m radius reaches a bit less than half a cell in each direction.)

The result is, finally, the part of the swath falling into the NSIDC-raster:


In a final step, I take all of such swaths for one day and average them into a full image of that given day; see the Average Daily function in the script for details (images © JAXA EORC).


One issue with gdal_grid is its very slow performance (see this comment): one 89GHz band takes 10 minutes and a lower-resolution band 2 minutes of calculation time. That is about 25 minutes for all channels of one hdf file, and since every day has about 20 files, this means 8 hours for one averaged daily raster. gdal_grid may therefore not always be feasible until the speed issue is improved.

My Corner of the Web: Naming conventions on Ruby & Ruby on Rails

Originally posted on Selva's Blog:

Ruby Naming Conventions

Local Variables
A lowercase letter followed by other characters; the naming convention states that it is better to use underscores rather than camelBack for multiple-word names, e.g. mileage, variable_xyz

Instance Variables
Instance variables are defined using the single “at” sign (@) followed by a name. It is suggested that a lowercase letter should be used after the @, e.g. @colour 

Instance Methods
Method names should start with a lowercase letter, and may be followed by digits, underscores, and letters, e.g. paint, close_the_door

Class Variables
Class variable names start with a double “at” sign (@@) and may be followed by digits, underscores, and letters, e.g. @@colour

Constant 
Constant names start with an uppercase letter followed by other characters. Constant objects are by convention named using all uppercase letters and underscores between words, e.g. THIS_IS_A_CONSTANT

Class and Module 
Class and module names start with an uppercase letter; by convention they…

View original 497 more words


My Corner of the Web: Why You Should Test

Originally posted on Six Months of Ruby:

Test driven development is both loved and loathed by software engineers everywhere. It’s a complicated relationship. While the benefits of TDD are well established, programmers are human, and the human mind is reluctant to associate non-visible progress with perceived progress.

However, saying “I’ll get this done faster if I don’t write tests” is like saying “I’ll get to China faster by going in a straight line.” It’s a great sentiment if you’re willing to dig through a lot of rock. The fact is that writing tests might seem like taking a detour, but they also show you an easier path to where you want to go.

The Benefits

The first benefit of TDD is something almost intangible. Writing tests forces you to think about what you want to do. It forces you to plan, to design, and to define in your own mind what needs to be done. This…

View original 166 more words


It's All About Data: Automatically Sync Data with Google Maps Engine

Sean Wohltman of Google has published a series of video tutorials on how to automatically synchronize any data with Google Maps Engine. He uses a simple FME Workspace to check for updates on the data, and automates the workflow with an FME Server instance running on Google Compute Engine.

At the time of writing, there are three videos, but by the sounds of it, more might be on the way. The demos are clear and easy to follow, and offer a fantastic overview of what’s possible with Google Maps Engine and FME. Check them out below.

1. Simple translation from a JSON feed to Google Maps Engine

In the first video, Sean demonstrates how to create a simple workflow that reads data from an ArcGIS Server running on Amazon EC2, and loads it into Google Maps Engine. He then goes over what you can do with the data once it’s in GME—including styling the layer and info window, publishing a map, submitting it to Google Maps Gallery, and contributing to Google Crisis Map.

(Worth noting: Not shown in the video, the FME Data Inspector does offer background maps, which you can turn on in the preferences.)

While Sean reads from an Esri JSON feed of CAP images, it’s easy to see how any source data can be transformed and loaded into GME in a few simple steps. Anything goes in the Google Maps Gallery! If you don’t believe me, go look at this map of James Bond movies. (Then please sign my petition to have the next one filmed in Canada…)

2. Keeping the data synchronized

Data is ever-changing, so it’s often not enough to load it into the destination system just once. Take this map of coffee shop chains, for example. I’ll eat my limited edition Safe Software hat if that data hasn’t changed since it was loaded in 2013.

Sean’s second video shows how to keep the above data in sync. Since it’s probable that the ArcGIS Server in this scenario will get frequent updates, it’s important to keep the live data connected to the GME map.

Sean demonstrates how to create the workspace that checks for and inserts any new records that aren’t currently in the GME table. He then shows how to check for existing fields that have been updated.

3. Automatically check for changes and update the map

After the above two videos, we now have an FME Workspace that synchronizes the ArcGIS Server data with the Google Maps Engine table. The third video shows how to check for new or updated data at a set interval, and update the Google Maps Engine table as necessary.

Sean introduces us to FME Server, which he’s running on a Linux virtual machine inside of Google Compute Engine. He goes over how to publish a workspace to FME Server, and gives an overview of what you can do with the workspace once it’s there. He configures the server to automatically run this ArcGIS-to-GME job every half hour.

In the end, Sean’s workflow successfully syncs Civil Air Patrol photos from FEMA to Google Maps Engine using FME. What other scenarios can you envision? How about this one, which updates every 5 minutes with earthquakes around the world? (And then there’s this one of seismic hazards, which makes me think I should probably be hiding under my desk right about now.)

To learn more and see other scenarios like this, check out our many resources, including this webinar recording and blog post. If you’d like to try FME technology and connect your data with Google Maps Engine, download your free trial of FME Desktop. If you need a simple one-off translation to get your data into the Google Maps Gallery, try out the free Data Loader for Google Maps Engine.

We would love to see what you’re doing with FME and Google Maps Engine. Be sure to share your data challenges with us in the comments!

The post Automatically Sync Data with Google Maps Engine appeared first on Safe Software Blog.

GIS Lounge: Views From the 2014 Esri International User Conference: GIS In Imagery

Photographer Kristina Jacob shares moments she captured from the 2014 Esri International User Conference that was recently held during July in San Diego, California.  One of the largest GIS conferences in the world, the Esri UC played host to over 15,000 attendees from around the world. The Esri International User [...]

The post Views From the 2014 Esri International User Conference: GIS In Imagery appeared first on GIS Lounge.

GeoServer Blog: GeoServer 2.5.2 release

The GeoServer team is happy to announce the release of GeoServer 2.5.2. Download bundles are provided (zip, war, dmg and exe) along with documentation and extensions.

GeoServer 2.5.2 is the next stable release of GeoServer and is recommended for production deployment. Thanks to everyone taking part, submitting fixes and new functionality:

  • Some fixes in the new GetFeatureInfo engine, for unfilled polygons and dashed lines
  • Solved a configuration issue that might have linked two styles together (edit one, and the other changed as well)
  • Fixed a DBMS connection leak in some WFS GetFeature requests with bounds enabled
  • Have WPS work properly as a workspace-specific service, and report the current process activity during asynchronous requests (for processes that report what they are doing in plain English, besides offering a progress percentage)
  • Add a way to restrict the output formats supported by WMS GetMap and GetFeatureInfo
  • More docs on how to set up JNDI connection pools in GeoServer
  • Thanks to Andrea (GeoSolutions) for publishing this release
  • Check the release notes for more details
  • This release is made in conjunction with GeoTools 11.2

About GeoServer 2.5

Articles and resources for GeoServer 2.5 series:

 

AnyGeo: Cool Geo Technology – 10 Awesome Things seen at 2014 ESRIUC

Indeed there’s no shortage of cool and amazing technology to touch, see, and hear about at ESRIUC; actually, it’s almost overwhelming, so creating a list of just 10 awesome things I saw is no simple task. Hopefully I won’t crush … Continue reading

LiDAR News: USGS to Host Briefing: Safer Communities, Stronger Economies – in 3D

The USGS is doing an excellent job of selling the benefits and value of LiDAR-derived 3D information. Continue reading →


All Points Blog: Shoppers: Don’t Track Me Unless…

PunchTab's "Mobile Tracking: Are Consumers Ready?" report surveyed 1,153 consumers about sharing their personal information, including locations, in return for deals and services. Fifty percent of participants did not want to be tracked and 27 percent said "maybe" it'd be ok, but only... Continue reading

All Points Blog: GIS Government News Weekly: Saginaw Alerts, Irish Open Data, $83 Deeds

Saginaw Alerts Landlords of Police/Fire Calls. When Saginaw police officers or firefighters respond to a call at one of Saginaw's nearly 5,000 registered rental properties, the landlord will be notified the following Monday. The new feature of the GIS goes live August 4. California... Continue reading

GIS Lounge: First Landsat Spacecraft Launched – Today in Geospatial – July 23

On July 23, 1972, the first Landsat spacecraft was launched.  At the time it was known as the Earth Resources Technology Satellite and it was the first satellite launched to study the earth’s landmasses.  In 1975, the name was changed to Landsat.  Since then, this program has been continuously monitoring changes [...]

The post First Landsat Spacecraft Launched – Today in Geospatial – July 23 appeared first on GIS Lounge.

geomobLDN: September 16th #geomob details

I’m pleased to announce a fantastic lineup for our next #geomob, to be held at 18:30 on Tuesday the 16th of September at the facilities of the BCS at 5 Southampton Street near Covent Garden. We’ve had occasional #geomobs at the BCS in the past, they are generous hosts, and I am pleased to announce they are now our formal location sponsor! We will continue to have events elsewhere but will now be spending more time at the BCS. Many thanks to them for their generosity. If you are not yet a member of the BCS I encourage you to have a look at their offering.

Their facilities (and food) have been excellent, but in the past we’ve had the unfortunate situation that people have been turned away because our room was too full. I’ve been assured that we will have more than adequate space on the 16th, but to help us get a better gauge of the numbers and plan appropriately please be sure to sign up on the Lanyrd page for the evening (where you will also find a map to the venue). 

Our speakers for the evening will be:

I hope you agree it looks like a great lineup. Attendees will have the chance to crown the evening’s best speaker, awarding him or her a free SplashMap and geo bragging rights.

As always the formal portion of the evening will be followed by drinks at a nearby pub, generously paid for by our sponsors: OpenCage, knowwhere consulting and SplashMaps, and I strongly encourage all to attend. Very often the informal discussions are as fruitful and informative as the talks.

If you can’t make it on the 16th please join us at the following #geomob, which will be Tuesday, November 4th (we’ll be back at the BCS), and for those who really like to get a jump on things you can already sign up for Tuesday, January 13th, which will be the first #geomob of 2015. For both of those events we’re still (at the time of writing) on the hunt for speakers; please volunteer if you’re doing anything of interest in the location space. There’s one other point worth noting: to date the #geomob speaker list has reflected the unfortunate lack of gender diversity in the neogeo community, and we strongly encourage applications from female speakers to help address this imbalance.

Enjoy your summer, and I look forward to seeing you all on the 16th of September.

Ed (@freyfogle)

OpenGeo: Boundless Growth

Eddie Pickle

Boundless is growing, but our focus remains helping customers build the best web mapping applications. We’ve recruited a stronger leadership team and are enhancing our current products while also developing new products and services to help customers in new ways.

We recently added Ann Johnson as our new CEO to strengthen our team and leadership. Ann’s experience with enterprise software, particularly her deep knowledge of security, greatly expands our expertise in critical areas of Spatial IT.

At the same time, we have been steadily adding capabilities to OpenGeo Suite to enable easier, more scalable web services deployments. To help our customers successfully build enterprise mapping applications, we’re advancing open source software like OpenLayers 3 and providing new SDK templates that leverage this new technology. We continue to “lower the cost of free software” by making secure, fault tolerant, and scalable deployments of our open source stack simpler, faster, and easier.

Additionally, some of our key initiatives include:

With Ann joining Boundless, I’m focusing my efforts on developing these and other capabilities to grow our business, with a particular emphasis on developing new partnerships to create a new geospatial ecosystem based on open source software. We believe the future is bright for open source–based solutions to address a variety of challenges, from publishing data to asset management to customer services. I look forward to working with our customers and partners to reshape this industry!

The post Boundless Growth appeared first on Boundless.

ROK Technologies Blog: Downloading User Generated Graphics

The other day, while working on a jQuery/ArcGIS JavaScript API web app I was developing, I was tasked with adding the ability for a user to download graphics they had added to the map. Although my immediate reaction was that this might not be feasible, I was intrigued by the possibilities it could lead to.

After a quick Google search for ideas, I came across JS2Shapefile and thought, “Great, the hard work is already done!” JS2Shapefile is a JavaScript class that creates Shapefiles in the browser, avoiding any calls to the server. The Shapefiles it exports are missing their projection file, however, which our client needs. Back to the drawing board I went…

It was time to create a model that would ingest the user-generated graphics and keep the spatial reference while exporting the data back to the user. Remembering there was an “Extract Data” tool that ESRI includes within ArcToolBox’s Server Tools, I pursued leveraging that for my needs. First, I created a small file geodatabase (gdb) with 3 empty layers in it (polygon, line and point) and loaded those into ArcMap. Next, while in ArcMap, I used the “Draw To” tools to draw a polygon, line and point, and then converted those graphics to Shapefiles. Now I had all of the pieces to start my model. The first tool I added to the model was the “Delete Features” tool. I set it up to use the polygon layer from the file gdb as the input, so an empty layer is created every time the tool runs. Next, I added and set up the “Append” tool to use the empty polygon layer as the target layer, and also created a Feature Set variable (as a model parameter) with the polygon Shapefile as its source. The last tool I added was the “Extract Data” script that ESRI has already created. I used the polygon layer from the file gdb as the layer to clip, and created Spatial Reference and Feature Format variables (also as model parameters) to allow for more download options in addition to Shapefile. I replicated the model for each of the geometry types and published the results to ArcGIS Server, resulting in 3 separate Geoprocessing Services.

Model Layout


All that was left was to work out how to pass the user-generated graphics to the services. After a quick look through the JS API reference page I found my solution: I needed to create a Feature Set out of the graphicsLayer that I was using to display those user-generated graphics. To do that, I gave the graphicsLayer an ID and used map.getLayer(<id of graphicsLayer>).graphics to pass the graphics into the Geoprocessing Service. That was it. Now I have a tool that will take user-generated graphics and export them in various formats: Shapefile, File Geodatabase, Autodesk AutoCAD and Bentley Microstation Design (V8).

Sample JavaScript Code:

// Graphics layers that hold the user-drawn features.
// (GraphicsLayer and FeatureSet come from the ArcGIS JS API; gpPoint, gpLine
// and gpPolygon are the Geoprocessor objects for the three published services,
// and gpJobStatus/gpJobFailed are callbacks defined elsewhere in the app.)
var drawPointGraphic = new GraphicsLayer({id:"drawPointGraphic"});
var drawLineGraphic = new GraphicsLayer({id:"drawLineGraphic"});
var drawPolyGraphic = new GraphicsLayer({id:"drawPolyGraphic"});

function convertPointGraphicToShapefile(){
    // Wrap the layer's graphics in a FeatureSet for the geoprocessing task
    var featureSet = new FeatureSet();
    featureSet.features = map.getLayer("drawPointGraphic").graphics;
    if(featureSet.features.length != 0){
        $('#indicatorptSHP').show();
        var gpparams = { "Feature_Set": featureSet, "Feature_Format": $("#exportPointType").val() };
        gpPoint.execute(gpparams, getShapeFiles, gpJobStatus, gpJobFailed);
    }
}

function convertLineGraphicToShapefile(){
    var featureSet = new FeatureSet();
    featureSet.features = map.getLayer("drawLineGraphic").graphics;
    if(featureSet.features.length != 0){
        $('#indicatorlnSHP').show();
        var gpparams = { "Feature_Set": featureSet, "Feature_Format": $("#exportLineType").val() };
        gpLine.execute(gpparams, getShapeFiles, gpJobStatus, gpJobFailed);
    }
}

function convertPolygonGraphicToShapefile(){
    var featureSet = new FeatureSet();
    featureSet.features = map.getLayer("drawPolyGraphic").graphics;
    if(featureSet.features.length != 0){
        $('#indicatorpolySHP').show();
        var gpparams = { "Feature_Set": featureSet, "Feature_Format": $("#exportPolyType").val() };
        gpPolygon.execute(gpparams, getShapeFiles, gpJobStatus, gpJobFailed);
    }
}

// Open the download URL returned by the Extract Data service
function getShapeFiles(result, messages){
    window.open(result[0].value.url);
}
 

All Points Blog: GIS =

Yet again I found an article where GIS was expanded as Global Information System. Falkowski is a graduate of Valparaiso University in planning and building and with global information systems, or GIS. Falkowski worked for Lake County Planner Ned Kovackevich and had also been active in... Continue reading

GIS Cloud: Offline Maps are out!

Offline Maps are now available in Mobile Data Collection app for both iOS and Android devices.

The Offline Maps option makes a map of a defined area available when there is no Internet connection.

You can define your area of interest while online by choosing the area to download and panning/zooming to the area of capture.


Two levels of zoom will be available by default but you can change the zoom level by sliding the Levels button to the desired value.

This is still an experimental phase of Offline Maps, so we invite you to test this feature and send us your feedback.

Contact us for more information.

LiDAR News: Cost of LiDAR Sensors an Issue for Driverless Market

I was interviewed and quoted in this article published today in the Wall Street Journal entitled, "Laser Eyes Pose Price Hurdle for Driverless Cars." Continue reading →

