Planet Geospatial

AnyGeo – Online Mapping for Beginners – The Map Academy from CartoDB

Something innovative from the CartoDB crew (timely given their recent announcement of CartoDB Enterprise) … The Map Academy, a series of online mapping courses – Learn to create maps on the web and visualize geospatial data. The first course is … Continue reading

AnyGeo – Video – The Known Universe

An amazing video of essentially everything! Developed by The American Museum of Natural History along with The Rubin Museum of Art, located via the Graffiti Kings Facebook page – impressive indeed!

Directions Magazine – OSIsoft & Esri Partner to Deliver Unified Solution for U.S. Federal Government Organizations

Directions Magazine – FACT SHEET: White House Innovation for Disaster Response and Recovery Demo-Day

AnyGeo – EDU Resource – ESRIUC Papers and Tech Workshop Presentations now Online

Here’s a great resource from Esri that has come out of the 2014 ESRIUC. The company has now generously made available the technical workshop presentations and papers from the 2014 conference. Browse the vast collections of papers and presentations, more … Continue reading

North River Geographic – Finding Nemo….errrrrr Data.

This all started off with a small project I am doing for a non profit. The project will possibly get a bit larger as time goes on, but we are still in the beginning stages of it. My client has no knowledge of GIS. They’ve been haphazardly locating hospitals with which to work and placing them on a map. Which to us seems crazy, but to them it gets the job done. I’ve been trying to put together an estimate for work and there are way too many unknowns. One estimate had me way too high….one had me way too low.

So yesterday I started digging. I won’t name states – but I will give you an overview of the process. I start this with “They’ve been hand digitizing hospitals”.

  • My first thought was The National Map…except something like hospital locations is way outside the scope of that project.
  • My next guess was the Census, which seems to have some hospital locations, but they are polygons. I would like better than a centroid, and I would like something more “authoritative” coming from the states.

I decided to visit each state’s “GIS” setup….department….portal….website…..library. This took place over 5 hours….I think…maybe more.

  • With two of the states I was able to google “Statename Hospital shapefile” and find exactly what I was looking for in less than two clicks.
  • Every major university has a half-assed page devoted to GIS links. Yes I said half-assed. Most of these aren’t curated or maintained.
  • In every search mapcruzin was ranked at the top or close to the top for data searches. I have no idea where this data came from – so I didn’t even care to investigate it.
  • One state’s search returned metadata (Yay), which pointed me to an online link that was dead (Boo). I vented over Twitter and ended up using the Internet Wayback Machine thanks to @jalbertbowdenii. He then pointed me to the state’s new data “portal” and I found the data there. I had been searching for “Statename GIS Data” and was really coming up empty. All that came back was a website with no clear link to data. Lots of pics and mentions of awards and presentations…..but no data.
  • One state search led me exactly to that state’s data portal. A search on that portal for “shapefile” returned 0 results. After some exploring I discovered a second link to another portal, which was down for maintenance. I ended up emailing the admin, who pointed me in another direction. There was another link that took me to a Google map of all the state’s hospitals with no way to look at the data – just the map.
  • One state had all of its data as either “statewide” or “county”. If you didn’t find data in one, you had to look in the other. Of course hospitals were called “acute care health facilities” or something that to me said “here is where I will be at 70" and not in a hospital. Data was in an executable zip file.
  • One state’s search had me register with their data portal. I downloaded the data only to find it in an Esri personal geodatabase format. I have ArcGIS Desktop, so I opened it and exported it out. I could have done this in QGIS – but time. This website also had a ton of .e00 files. What are those, you ask? Look it up, kids.
  • One state’s data portal was being replaced…or had been. So all of its links were dead. One link did work – the state’s health department, where a search for hospitals returned a 50-some-odd-page PDF with hospital information and addresses scattered about.
  • Right now I’m down to 2 emails and I’ve got the region covered in hospitals. Except now I’m up to 8 hours of time spent on something that shouldn’t be this hard. I know 8 hours compared to hand digitizing all this data is nothing…but….8 hours to find location data for hospitals? I’m not searching for the nearest fast food restaurant. I’m searching for buildings that provide health care.

So I’ve decided that the idea of accessible GIS data must really be a joke on some level. We hear about it. We talk about it. Vendors tell us how open their products are and how data is being shared. I’m up to 8 hours in my search. Crazy that they were digitizing this data by hand? Not so much when you look at what you have to go through to find data. How hard is it to find data relating to hospitals for your state? Look – in some cases it will be easy…in some……it’s really a joke. Open data doesn’t mean much if you can’t find it.

No two states had the same setup. Almost all were, on some level, discussing their “data portal” or “GIS Portal”. Only in one instance did metadata come into play….and it had the wrong online linkage. Why not put your metadata up as a page with the right online linkage? 10 states, 10 pages, 10 links, and 10 downloads. Silly – because then you don’t get to spend half a year building a portal with pics of people and awards and presentations.

For those of you with data portals that are just that – spots to share and download data. THANK YOU.

Except – I can download Mars geology shapefiles. Google that and it’s the first return….



Making Maps: DiY Cartography – Mapa de los Paisajes de Cuba | Map of the Landscapes of Cuba | 1949 | Gerardo Canet & Erwin Raisz



The Atlas de Cuba by Gerardo Canet and Erwin Raisz, featured in a recent posting here, is accompanied by a large color map entitled Mapa de los Paisajes de Cuba (Map of the Landscapes of Cuba). The map is a hybrid of Raisz’s landform map style supplemented with diverse human landscape components. Canet and Raisz explain their methodology:

The accompanying map of Cuba is a new experiment in cartography. Color suggests land types: cultivated fields, pastures, mountains, swamps, valleys, etc. The symbols were selected after a series of flights over the Island and an analysis of numerous color photographs taken from the air. It is expected that in this way the map will better reflect reality, more closely resembling an air view of the Island than the conventional maps we now have.

This approach is part of a tradition of natural or real color mapping combining terrain (in particular, shaded relief) with air imagery or map symbolization inspired by air imagery – an obvious outcome of aerial mapping in the early part of the 20th century. An article by Tom Patterson and Nathaniel Vaughn Kelso entitled Hal Shelton Revisited: Designing and Producing Natural-Color Maps with Satellite Land Cover Data (2004) delineates the authors’ development of the Natural Earth data (shaded terrain + satellite land use data) in the context of earlier, related work by Hal Shelton, Eduard Imhof, Heinrich Berann, Richard Edes Harrison, and Tibor Toth. It seems that Raisz was also an innovator in this realm of air-imagery-inspired map design.


See also, at this blog, Raisz’s History of American Cartography Timelines, Map Symbols: Landforms & Terrain, and Raisz’s currently available landform maps at



Mapa de los Paisajes de Cuba, 1949 (36.6mb)



geoMusings – Slow Food

In 1985, I was a junior in high school and I got my first job at a local chain steakhouse. I ended up staying there for a few years and did everything, including management. This particular location happened to be the busiest store in the chain, which had a couple hundred locations at the time. Basically, we just unlocked the doors and people came in. We often had a line and managers from all over the country came to see how we did business.

Eventually, I transferred to another store that happened to be much slower. I expected this to be something of a cakewalk compared to the store I had just left. Sometime during my first day, the long-time manager made a point to remind me to close the back dining room after lunch and turn the lights out. The dark room looked uninviting to me so I asked if we could leave the lights on. He replied that doing so would raise the electric bill and affect the store’s profits. Over the next few weeks, I learned so many new techniques for managing food inventory, staffing levels, and equipment that I realized my initial impression was wrong. So I told the manager this. He was not surprised.

He said to me: “It takes more skill to run a slow store than a busy one.”

He was absolutely right. At the first location, we would simply order enough food to fill the walk-ins and did enough prep every day to fill the racks. We quickly ramped our staffing up to max levels every day and just waited for the customers to roll in. There was a myriad of issues we simply never had to worry about because sales volume masked them.

It’s been many years since I’ve had to work in a restaurant but I have found this observation applicable throughout my career, during which I have worked primarily in the defense industry, designing and building various forms of geospatial tools and systems. Two weeks ago, I attended the Esri International User Conference. The Esri UC is full of opportunities to see more of these types of applications, from the Defense Showcase to the National Security Summit to many defense/intel/homeland security related paper sessions.

I attended almost none of them.

Over the past few years, I have developed the habit of attending sessions related to local governments, non-profits and other small organizations. These organizations tend to have budgets that are essentially rounding error in the defense world and I have become much more intrigued by the solutions they build to meet their missions with so few resources.

Of course, I was at an Esri conference, so everyone I saw was an Esri user. It’s no secret that Esri tools tend to be expensive and every dollar counts for small governments, so I am intrigued by how these organizations arrive at the choice of Esri tools in light of so many other capable options. Too often, there is a misconception that these small governments simply choose Esri-based solutions out of a lack of understanding of alternatives. What I have discovered is that, like the manager of the slow restaurant, some of the best business understanding and most cogent and well-reasoned justifications come from these small users.

They seem to take a much more holistic approach to evaluating their solutions, including the availability of support from the technology provider as well as the existence of a robust third-party community that enables them to effectively compete for value-added services, customizations, and consulting. What I found most interesting was the most consistent attraction of local governments and small organizations to Esri-based solutions: the seamless integration of the entire ArcGIS platform. From mobile to desktop to web, they find that Esri tools make it easier to move through the product lifecycle. This exposes a very business-centric way of thinking that is larger than any individual technology or its licensing cost.

As I indicated above, an Esri conference places a huge filter on the users whom you meet. As I attend less vendor-centric events in the months ahead, I’d like to continue this line of inquiry. While landing the giant white whales of large Federal agencies might be more satisfying for the short-term bottom line, it is important to remember that all levels of government are facing downward budget pressure. Learning the thought processes of organizations that have always had to tightly manage limited resources may be more valuable in the long run.

LiDAR News – Global 3D Scanning Market to Grow at 12.4% CAGR

The market will remain dominated by the North American and European regions, which collectively contribute more than two-thirds of the global revenue share. Continue reading →

Click Title to Continue Reading...

Directions Magazine – First Satellite Masters Conference Set to Launch in Berlin

All Points Blog – Michael Byrne Nominated for Service to America Medal

The awards are new to me! The Partnership for Public Service annually honors outstanding federal employees who have made a significant difference in the lives of Americans, presenting awards that have come to be known as the "Oscars of government service." Renamed the Samuel J.... Continue reading

Directions Magazine – Esri Canada Launches GIS Centres of Excellence in Higher Education

Directions Magazine – DGI Conference: Geospatial Intelligence Vital in Investigating Malaysia Flight MH17

Directions Magazine – Parks Associates: 78% of connected car owners will demand connected features in their next vehicle

All Points Blog – Geospatial Key Part of President’s Climate Data Initiative “Food Resilience” Effort

Today the Obama Administration is rolling out the Climate Data Initiative “Food Resilience” effort, "aimed at empowering America’s agricultural sector and strengthening the resilience of the global food system in a changing climate." Some geospatial highlights... Continue reading


All Points Blog – White House: Tackling Data from Earth Observation Systems & Reviews Apps for Disaster Response

The White House is keeping busy. In two separate efforts the Office of Science and Technology Policy (OSTP) is reviewing geospatial technology initiatives in earth observation systems and in disaster response and recovery. On July 18th, the OSTP released a report on the National Plan... Continue reading

Directions Magazine – Town of Laubach, Germany Implements Mobile Alert from Hexagon Geospatial

Directions Magazine – Aerial Mapping Company Bluesky Expands with New Appointments

The Map Guy(de) – GovHack 2014 post-mortem

Earlier this month, I attended the GovHack 2014 hackathon, along with thousands of fellow hackers all across the country. This was my first GovHack, but not my first hackathon. My previous hackathon was RHoK, and having no idea how GovHack would turn out, I entered the event with a RHoK-based mindset of how I expected it to go.

Bad idea.

I learned very quickly there was a major difference between RHoK and GovHack. Going into RHoK, you have an idea of what solutions you will get to hack on over the weekend, as problem owners are present to pitch their ideas to the audience of prospective hackers. With GovHack, you need to bring your own idea of what to hack on over the weekend; all they provide is the various open data and APIs. What on earth are we going to build?

So after losing nearly half the weekend to analysis paralysis, our team (named CreativeDrought, wonder why?) agreed with my suggestion of just building a MapGuide-based mashup of various open datasets, most notably the VicRoads Crash Stats dataset and related transportation data. I knew MapGuide and its capabilities inside-and-out, so I was confident that with the remaining weekend we should still be able to crank out some sort of workable solution. At the very least, we’d have a functional interactive map with some open data on it.

And that's the story of our CrashTest solution in a nutshell. It's a Fusion application, packed to the gills with out-of-the-box functionality from its rich array of widgets (including Google StreetView integration). The main objective of this solution was to allow users to view and analyse crash data, sliced and diced along various age, gender, vehicle type and various socio-economic parameters.

MapGuide's rich out-of-the-box capabilities, Maestro's rapid authoring functionality and GDAL/OGR's ubiquitous data support greatly helped us. I knew with this trio of tools, that we could assemble an application together in the remaining day and a bit left that we had to actually "hack" on something.

Sadly, we only got as far as putting the data on the map for the most part. Our team spent more time frantically trying to massage various datasets via ogr2ogr/Excel/GoogleDocs into something more usable than actually writing lines of code! Seriously VicRoads? Pseudo-AMG? Thank goodness I found the necessary proj4 string for this cryptic coordinate system so that we could re-project a fair chunk of the VicRoads spatial data into a coordinate system that better reflects the world we want to mash this data up with!
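For a sense of what that re-projection step looks like, here is a minimal sketch of driving ogr2ogr from Python. The file names and the proj4 string are placeholders, not the actual VicRoads parameters – the real Pseudo-AMG definition would have to be looked up.

```python
# Sketch: re-project a shapefile from a cryptic source CRS into WGS84
# with ogr2ogr. PSEUDO_AMG below is a stand-in proj4 string, NOT the
# real Pseudo-AMG definition.

# A hypothetical transverse-Mercator definition used only for illustration.
PSEUDO_AMG = ("+proj=tmerc +lat_0=0 +lon_0=145 +k=1 "
              "+x_0=500000 +y_0=10000000 +ellps=aust_SA +units=m +no_defs")

def build_ogr2ogr_cmd(src, dst, s_srs, t_srs="EPSG:4326"):
    """Assemble the ogr2ogr argument list that re-projects src into t_srs."""
    return ["ogr2ogr", "-s_srs", s_srs, "-t_srs", t_srs, dst, src]

cmd = build_ogr2ogr_cmd("crashes_raw.shp", "crashes_wgs84.shp", PSEUDO_AMG)
# Where GDAL is installed, run it with:
#   import subprocess; subprocess.run(cmd, check=True)
```

Batch-converting a whole directory of datasets this way is what saved the remaining day, since each file only differs in the source/destination names.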

Still, our "solution" should hopefully still open up a lot of "what if" scenarios. Imagine looking at a cluster of accident events, not being able to ascertain any real patterns or correlation and so you then fire up the StreetView widget and lo-and-behold, Google StreetView providing additional insights that a birds-eye view could not. Also imagine the various reporting and number crunching possibilities that are available by tapping into the MapGuide API. Imagine what other useful information you could derive if we had more time to put up additional useful datasets. We didn't get very far on any of the above ideas, so just imagine such possibilities if you will :)

So here's our entry page if you want to have a look. It includes a working demo URL to an Amazon EC2-hosted instance of MapGuide. Getting acquainted with Amazon Web Services and putting MapGuide up there was an interesting exercise, and much easier than I thought it would be, though I didn't have enough time to use the AWS credits I redeemed over the weekend to momentarily lift this demo site out of the free usage tier performance-wise. Still, the site seems to perform respectably well on the free usage tier.

Also on that page is a link to a short video where we talk about the hack. Please excuse the sloppy editing; it was obviously recorded in haste, in a race against time. Like the solution and/or the possibilities it can offer? Be sure to vote on our entry page.

Despite the initial setbacks, I was happy with what we produced given the severely depleted time we had. I think we got some nice feedback demo-ing CrashTest in person at the post-mortem event several days later, which is always good to hear. Good job team!

So what do I think could be improved with GovHack?
  • Have a list of hack ideas (from participants who actually have some ideas) up some time before the hackathon starts. This would facilitate team building, letting participants with the skills but without ideas easily gravitate towards the people/teams with ideas.
  • The mandatory video requirement for each hack entry just doesn't work in its current form. Asking teams to produce their own videos puts lots of unnecessary stress on them: they not only have to come up with the content for the video, but also have to deal with the logistics of producing it. I would strongly prefer that teams who can/want to make their own video do so, while other teams just do a <= 3 minute presentation and have that recorded by the GovHack organisers. Presentations would also let teams find out how other teams fared over the weekend. While everyone else in the ThoughtWorks Melbourne office was counting down to the end of the hackathon, I was still frantically trying to record my lines and trying not to flub them! I raided the office fridge for whatever free booze remained just to calm myself down afterwards. I don't want to be in that situation ever again!
  • Finally, the data itself. So many "spatial" datasets as CSV files! So many datasets with no coordinates, just addresses – horribly formatted addresses – adding even more hoops to geocode them. KML/KMZ may be a decent consumer format, but it is a terrible data source format. If ogr2ogr can't convert your dataset and it requires manual intervention in QGIS to fix, then perhaps it's better to use a different spatial data format. Despite my loathing of its limitations, SHP files would've been heavily preferred for all of the above cases. I've made my thoughts known on the GovHack DataRater about the quality of some of these datasets we had to deal with, and got plenty of imaginary ponies in the process.
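The CSV complaint above is easy to illustrate: when a "spatial" CSV at least has coordinate columns, converting it to a real exchange format is a few lines of stdlib Python. The `lat`/`lon` column names here are assumptions – as noted above, the actual GovHack datasets were nowhere near this tidy.

```python
# Minimal sketch: convert a CSV with "lat"/"lon" columns into a GeoJSON
# FeatureCollection. Column names are assumed; real feeds varied wildly.
import csv
import io
import json

def csv_to_geojson(text, lat_col="lat", lon_col="lon"):
    features = []
    for row in csv.DictReader(io.StringIO(text)):
        lon, lat = float(row[lon_col]), float(row[lat_col])
        # Everything that isn't a coordinate column becomes a property.
        props = {k: v for k, v in row.items() if k not in (lat_col, lon_col)}
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": props,
        })
    return {"type": "FeatureCollection", "features": features}

sample = "name,lat,lon\nSite A,-37.81,144.96\n"
fc = csv_to_geojson(sample)
geojson_text = json.dumps(fc)
```

Datasets with only free-text addresses still need a geocoding pass first, which is where the real pain described above comes in.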
Despite the above points, the event as a whole was a lot of fun. Thanks to the team (Jackie and Felicity) for your data wrangling and video production efforts.

Also thanks to Jordan Wilson-Otto and his flickr photostream, from which I was able to get some of the photos for this particular post.

Would I be interested in attending the 2015 edition of GovHack? Given I am now armed with 20/20 hindsight, yes I would!

Technical Ramblings – GoPro: Upgrade Complete

In May, I said my next upgrade would be a GoPro. Last week, I made good on that, buying a GoPro Hero 3+ Black. (Thanks to my employer, I even got a 40% discount.) For the first few days, I played with it – did a sunset timelapse, and contributed to the internet cat database by adding pictures of the kitties in a charming 2.7k video. (2.7k probably only available in Flash or Chrome.)

But in part, my reaction was “Enh. What did I think I needed this for?” After all, I already had a camera for my quadcopter (in the form of the A5-powered FC40 camera); I had a DSLR (which can shoot great 720p videos); and I had my phone, which could do decent 1080p video and which I always had on me. Why exactly did I need *another* camera?

Today, I finally took the GoPro up on the Phantom, and that particular concern is gone.

I was recording video, so I didn’t get any particularly high-res stills – but that wasn’t really a concern. Shooting 1080p video, I was able to get great shots of the storm clouds out west of town:

And another photo looking East towards Boston.

The colors, the clarity, the overall picture and video quality … this is what I kept seeing in everyone else’s photos, even though I wasn’t getting it in mine. I always wondered if the problem was just my tools… and now I know. As with everything, the user makes a difference… but the tools certainly help.

GeoSprocket Community Jive – Moving to a New Gig

After many fruitful years of operating Geosprocket LLC, today I'll be joining the crew at Faraday, working as the lead visualization engineer. This is an exciting time to be working with data science and data design, and I'm psyched to be a part of a talented team digging into it with modeling, mapping, charting and other assorted insanity. I may even be blogging from time to time under the Faraday banner and occasionally on my own over at Medium.

I'll be wrapping up a series of existing projects, but then Geosprocket will be on hold. Thanks to all of you who have been such great clients, collaborators, mentors and friends. I look forward to keeping the conversation going!

My Corner of the Web – If You Want a Job, Wake Up and Smell the Coffee

Bob Roberts:

As a Rails Developer and one that hires developers, I can attest to “the unemployment rates range from less than 1% to just over 3%”

Originally posted on Joblink@Work Blog:


I must confess.  Although I like gadgets, I am not a techie per se.  Many of us enjoy technology and believe that we are well versed in it.  But, I would call that the End-User Syndrome.  That is, we enjoy the benefits of our smart phones and mobile devices.  We love to be able to do all sorts of things online, including looking up answers quickly, making purchases, and paying bills (OK, that’s not so much fun).  However, few of us would ever be interested in assembling hardware or even writing the code which drives the Internet or the apps we use.

Today, I had an opportunity to meet with recruiters Alexa and Lee at a technology recruiting company in my area.  Following an explanation of what I do, I asked them about their staffing needs.  I also asked them what areas were hard-to-find and therefore represent opportunities for current…


GIS Lounge – Using Remote Sensing to Measure the Effect of Drought on Ground Water

Scientists at NASA’s Goddard Space Flight Center have been experimenting with a new data product to assess groundwater and soil moisture drought indicators. The maps produced are used by the U.S. Drought Monitor and use data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellites.   In 2002, the GRACE satellites were launched [...]

The post Using Remote Sensing to Measure the Effect of Drought on Ground Water appeared first on GIS Lounge.

All Points Blog – Private and Public Colleges Urge FAA to Allow Researchers to Use Drones

Tiny Smith College in Northampton, Massachusetts took the lead on an open letter (pdf) sent to the Federal Aviation Administration last week to protest that researchers don't have access to the same hobbyist model planes that ten-year-olds do. Current regulations prevent commercial users and... Continue reading


AnyGeo – Yes indeed, you can now download Mars Geology data in SHP files

Yes indeed, you can now start mapping Mars, as the USGS and NASA have made available (in SHP format) Mars geology, structures, landing sites and more – Twitter user Larry Buchanan @larrybuch has taken the data and dropped em into … Continue reading

VerySpatial – A VerySpatial Podcast – Episode 471

A VerySpatial Podcast
Shownotes – Episode 471
27 July 2014

Main Topic: Our conversation with Rick and Rusty from Eating West Virginia

  • Click to directly download MP3
  • Click to directly download AAC
  • Click for the detailed shownotes


  • When I am King by Great Big Sea

  • News

  • OSTP releases National Plan for Civil Earth Observation
  • Google Coordinate now part of Maps Engine Pro
  • Google partners with Environmental Defense Fund to map gas leaks
  • MapBox shows Don’t Fly Here

  • Web Corner

  • Foss4g Academy

  • Main topic

  • This week we feature our conversation with Rick Lawson and Rusty Hefner, hosts of the new blog Eating West Virginia, about their new undertaking and food geography.

  • Events Corner

  • 22nd IAHR Symposium on Ice: 11-15 August, Singapore
  • 5th Digital Earth Summit: 9-11 November, Nagoya, Japan – abstracts extended to Aug. 9th
  • Digital Past 2015: 13-14? February, Swansea, Wales, UK
  • International Symposium on Glaciology in High-Mountain Asia: 2-6 March, Kathmandu, Nepal
LiDAR News – Automated Vehicles – Autos vs. Computers

Brad Templeton explains in this thought-provoking article that there are two cultures thinking about automated vehicles. Continue reading →


North River Geographic – Announcing an Open Forestry Template

My longest running client has been a forestry company. Now – if you were to hand me an aerial photograph in stereo and say “Trees – start”, I would at least get the boundaries of the different types outlined. If you stuck me on the ground, I’d essentially go “That’s pine and that’s not”.

The forestry client has been an interesting project. As of late (over the last month) I’ve been slowly migrating them into PostGIS and QGIS. They have ArcView and we’re going to use that for map production (unless I keep getting better with cartography in QGIS, and then that is questionable). Moving to PostGIS has made the data a bit more cryptic to them but infinitely easier to manage for me. So much so that this morning’s call of “how many acres are currently on tract 31" was answered one small script later with a frequency table describing types of trees and acres (which you can’t do at the arcview level of licensing (and I know it’s not called arcview)).
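The post doesn’t show the actual “one small script”, but the roll-up it describes is just a grouped sum. In PostGIS it would be a single GROUP BY query (table and column names below are hypothetical); the same aggregation sketched in plain Python:

```python
# Sketch of the tract roll-up. In PostGIS it might look something like:
#   SELECT type, SUM(ST_Area(geom)) / 4046.86 AS acres
#   FROM stands WHERE tract = 31 GROUP BY type;
# (table/column names are guesses, and 4046.86 converts m^2 to acres).
# The same frequency-table logic in plain Python, with made-up numbers:
from collections import defaultdict

stands = [  # (tree type, acres) rows for one tract -- illustrative data
    ("pine", 40.0), ("pine", 12.5), ("hardwood", 8.0), ("mixed", 3.5),
]

freq = defaultdict(float)
for tree_type, acres in stands:
    freq[tree_type] += acres

for tree_type, acres in sorted(freq.items()):
    print(f"{tree_type}: {acres:.1f} acres")
```

The point of the migration is exactly this: once the data lives in a real database, questions like “how many acres of pine on tract 31” become one query instead of a desktop-GIS exercise.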

This database has grown from 2 shapefiles to over 22 shapefiles before the merge into PostGIS. Like I said, it’s a bit more cryptic but easier to manage, and as I keep changing it and getting it “stable” and “documented”, life will be better. The whole idea of this database has also evolved out of a couple of conversations with other forestry-minded individuals:

  • I’ve been using QGIS because it’s free – just not supported.
  • I’m still using ArcView 3.x because I know how to use it.
  • I’m not buying any more software to manage my GIS (in reference to needing ArcEditor).

So I’ve decided to do something weird and hopefully wonderful. If you have a commercial GIS setup, you can download a pre-built database for your industry. It’s a pretty great way to do things, actually. What if you don’t have that? You start piecemealing things together that make sense for you.

I don’t have access to industry leaders, or at least the people that show up at a conference. I do have a GitHub account, an idea, and you. Let’s build an Open Forestry Template (pre-built database). A common operating base for the rest of us. Now I’m going to take a narrow approach at first and work my way up. The first thing supported will be PostgreSQL/PostGIS/QGIS. The next will be QGIS/SpatiaLite. Next will be whatever I decide to do for Esri’s Desktop product. Since this is free and open I can’t devote a million hours – but I think this can work. If some of you want to help – help. Want to help fund development? I’m not opposed to that. Want me to help your forestry company out and implement it? Yippee. Know what you are doing and don’t need help? Download away.

This actually blends pretty well with the Spatial Connect. Process over software. Hopefully this will be the making of a very good, supported database for forestry people to use regardless of software.

So right now I’m recording notes in markdown files here: . I have thoughts for assets (which would be gates, bridges, and culverts), for stands, and for property boundaries. It’s not a lot – but it will start filling out as the weeks progress.
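As one guess at what the “stands” layer of such a template might look like, here is a sketch using the stdlib sqlite3 module as a stand-in for the planned SpatiaLite target (which would add a real geometry column). Every table and column name here is hypothetical, not taken from the actual template notes.

```python
# Hypothetical minimal "stands" table for an Open Forestry Template,
# sketched with sqlite3 standing in for SpatiaLite. In SpatiaLite the
# acres figure would come from the geometry, not a stored column.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE stands (
        id INTEGER PRIMARY KEY,
        tract INTEGER NOT NULL,   -- management tract number
        type TEXT NOT NULL,       -- e.g. pine, hardwood, mixed
        year_planted INTEGER,     -- NULL if unknown
        acres REAL                -- derived from geometry in SpatiaLite
    )
""")
conn.execute(
    "INSERT INTO stands (tract, type, year_planted, acres) "
    "VALUES (31, 'pine', 2004, 40.0)"
)
rows = conn.execute(
    "SELECT type, acres FROM stands WHERE tract = 31"
).fetchall()
```

The same schema would carry over to the PostgreSQL/PostGIS and Esri Desktop targets; only the geometry handling and type names change per backend.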

Once I feel like I have enough info I’ll start making databases and setting things up for download. DON’T WORRY – once this gets to a stable version I’ll make it easier for everyone to get to – GitHub can be too cryptic for us all to use and deal with – some of you are foresters and not full-time GIS people. I completely understand. If you want to email me your ideas, go for it. If you want to leave a comment on the blog with something that needs to go in, feel free.

Andrew Zolnai Blog – From static to dynamic web maps, continued

    Last month I reported on some time-stamped, time-slider, or time-aware maps – I showed my WhereIsAndrew map – and I mentioned how time-sliders are a great way to roll up diverse time-dependent datasets. The CLIWOC captains’ ship logs were the other example cited, and I proceeded to post them as a service from ArcMap on ArcGIS Online.

    Due to the large number of points I first had to circumvent the 1,000-point limit – it’s mostly there to maintain browser and server performance, and it’s quite easy to change when you publish the service from ArcMap. I was warned by Esri support, however, that I might run into performance issues regardless: mega-datasets are really meant to be on ArcGIS Server, not ArcGIS Online. Sure enough, even if I filter the data to a single nationality, the time slider cycles through time frames faster than the browser can refresh them from ArcGIS Online.

    click image to enlarge or go direct to webmap

    So until Mapcentia (mentioned last month) or others add time-stamping capability to their PostGIS back ends, we’re left with loading data onto servers ourselves, posting data in time-slices, or posting layer packages as I’ve done earlier. QGIS also has time-stamping added programmatically now, according to Anita Graser @underdarkGIS, though I’m not sure how that translates server-side per Jo Cook’s @archaeogeek caveat.
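    The “posting data in time-slices” workaround can be sketched in a few lines: bin the time-stamped points into fixed-width year buckets and publish each bucket as its own layer. The record layout below is a made-up illustration, not the actual CLIWOC schema.

```python
from datetime import datetime
from collections import defaultdict

def slice_by_year(records, step=5):
    """Group time-stamped point records into fixed-width year bins
    ('time slices'), so each slice can be posted as its own layer.
    `records` are (timestamp, lon, lat) tuples -- an illustrative
    layout, not the CLIWOC schema."""
    slices = defaultdict(list)
    for ts, lon, lat in records:
        bucket = (ts.year // step) * step   # e.g. 1752 -> 1750 with step=5
        slices[bucket].append((lon, lat))
    return dict(slices)

logs = [
    (datetime(1750, 3, 1), -9.1, 38.7),
    (datetime(1752, 7, 9), -70.0, 12.5),
    (datetime(1761, 1, 2), 18.4, -33.9),
]
print(sorted(slice_by_year(logs)))  # -> [1750, 1760]
```

Each slice stays well under any per-layer point limit, at the cost of managing more layers and losing the single continuous time axis.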

    So better to leave the heavy pulling to some, while the light steeds gallop on…

    (heavy-horse pulling and chuck-wagon racing from 2014 Calgary Stampede)

    LiDAR NewsUsing LiDAR to Measure Farm Air Quality

    “It lets us see what’s going on from many dimensions.” Continue reading →

    Click Title to Continue Reading...

    North River GeographicTwo Day QGIS training – August 21-22 2014 – Athens Georgia

    This is the first time this one has been attempted.

    The QGIS training has somewhat taken off. I’m not sleeping on a mattress of money, but people are interested and I get more than a few inquiries as to when and where it will be held. Right now I plan on teaching a two-day, back-to-back training session: Intro to QGIS and QGIS: Data and Editing. Intro will be on August 21, 2014 and Part II will be on August 22nd.

    Intro to QGIS covers the basics from exploring the interface to display of the data. We end that day building a map. This class really eases you into using the software.

    QGIS: Data and Editing goes more in depth. We cover the basics of editing as well as building data, georeferencing, symbology, and the geoprocessing toolbox. Right now I’m a little concerned this one is more than a day’s worth of material – it’s never been taught. So you won’t walk out an expert, but you should be able to figure out most of the questions you’re going to have on QGIS.

    PLEASE NOTE: If you want to take the Part II class you need to have taken Part I or BE VERY FAMILIAR WITH QGIS. I’m not explaining how plugins work – I’m just going to say “Plugins”. I’m not going to explain how to add data – just “Add Data”. There is a lot I’m taking for granted in Part II; I have to in order to get through it.

    Which leads me to… if I can sell out this class, the first one breaks even. The second one is on track to break even a lot quicker than the first. Regardless, I should be able to do a few things to support the QGIS community if this training event is successful.

    Which leads me to pricing…..

    Either class by itself is $325. So come take the intro class for $325. Come take the Editing class for $325. TAKE BOTH FOR $500. Yes – I’m chopping the price if you sign up for both.

    Class will be held at ITOS in Athens Georgia

    address: 1180 E Broad St, Athens, GA 30602

    Why take the classes? If you’re struggling with ArcEditor and shapefiles. If you’ve ever wondered how you can stretch your budget using open source tools. If you’ve ever said, “Man, that free stuff isn’t all that great – it’s free.”

    Email me for details!



    LiDAR NewsrFpro Delivers Driving Simulation Solutions

    A quick introduction to rFpro showing highlights of the latest (2014) build of the Nordschleife Continue reading →

    Click Title to Continue Reading...

    AnyGeoFirst Look Video – Cedar Tree CMP1, mini Android smartphone from @cedar_tech

    I’m pleased to share this one with you, shot on location at the 2014 ESRIUC in San Diego… likely your first chance to have a look at the really awesome Cedar Tree CMP1, a mini Android smartphone – my first look … Continue reading

    GeoServer BlogGeoServer Sunday Sprint (FOSS4G)

    Back in March we had a Save the Date post for FOSS4G 2014 in Portland. Check that your travel plans include participating in the Code Sprint at the end of the conference.

    The code sprint is an opportunity for the team to get together and work on “tough problems without funding”. For GeoServer we have two candidates:

    1. Update to a recent version of Wicket to improve browser compatibility
    2. Update CITE Conformance Tests

    GeoServer is extending the code sprint to include:

    • Saturday, September 13th: FOSS4G is providing facilities at WhereCampPDX.
    • Sunday, September 14th: Boundless is arranging facilities at nedspace from 10am-4pm.

    To attend, add your name to the OSGeo wiki page, and we will look forward to seeing you in Portland!





    Suite 250, SW 11th Avenue, Portland

    Thanks to Mike Pumphrey for arranging the venue for the Sunday Sprint.




    GIS LoungeNational Transportation Atlas Database

    The Bureau of Transportation Statistics recently released the 2014 National Transportation Atlas Database. From the BTS: The DVD is a set of nationwide geographic databases of transportation facilities, transportation networks, and associated infrastructure. These datasets include spatial information for transportation modal networks and intermodal terminals, as well as the related [...]

    The post National Transportation Atlas Database appeared first on GIS Lounge.

    GIS LoungeEvaluating Ecosystems from Space

    The impact of humans upon the planet’s ecosystems has long been a concern for scientists, and now the European Space Agency (ESA) is taking their assessment of natural resources to a new level with satellite technology. The emphasis of a new project from the ESA is on promoting sustainability and [...]

    The post Evaluating Ecosystems from Space appeared first on GIS Lounge.

    All Points BlogUpdate 2: Esri Announces MOOC: Going Places with Spatial Analysis

    More info via various sources. Suggestion to Esri: it’d sure be nice if all of this information was on the MOOC home page. Start date: Sept 3 (via Learn ArcGIS). MOOC guest lecturers (via @esrimooc): David Gadsden – Esri Nonprofit Program coordinator. Geographer. Humanitarian.... Continue reading

    North River GeographicQGIS Meetup – July 31 2014 in Atlanta Georgia

    So I’m piggybacking this meeting off of two others – Georgia URISA and MAGS. If you wouldn’t mind registering with both MAGS and the QGIS Meetup group, I would appreciate it.

    This first meeting is just to gauge interest in the group. I think there is enough to have one – and there doesn’t need to be any duplication with an existing GIS organization. My hope is that this leads to more meetings and more cross-pollination, where we start discussing more geospatial software. This isn’t just limited to Free and Open Source Software enthusiasts. Come down, have a drink, and meet the Georgia URISA folks and the MAGS people.

    thinkwhereNorth bar stone uncovered

    Via Penny Goodman’s tweet about a Secret Leeds Forum post on the North Bar Stone. Here’s a photo of the Beating the Bounds walk around the stone for Terminalia Festival 2014, and a photo of the stone today.

    AnyGeoNew from Google – Maps Coordinate + Maps Engine Pro

    Google has this week announced that Google Maps Coordinate will now be included with every Maps Engine Pro subscription ($5/user/month). Recall that Google Maps Coordinate is the mobile and web app that lets teams assign jobs and share their locations with each … Continue reading

    LiDAR NewsOregon Cities Partner on LiDAR Data Collection

    DOGAMI uses Quantum Spatial, a geo-mapping company with a Portland office, to gather the data. Continue reading →

    Click Title to Continue Reading...

    All Points BlogGIS Health News Weekly: Animal and People Pathogens

    Map Pathogens in Man and Animal for Prevention: Researchers at the University of Liverpool’s Institute of Infection and Global Health are building the world’s most comprehensive database describing human and animal pathogens. Called the Enhanced Infectious Diseases (EID2) database, it... Continue reading

    GeoServer BlogGeoServer 2.6-beta Released

    The GeoServer team is overjoyed to announce the release of GeoServer 2.6-beta.

    I hope you are enjoying the new website – the download page for 2.6-beta provides links to the expected zip, war, dmg and exe bundles. For this release we are experimenting with providing source downloads directly from the GitHub 2.6-beta tag.

    As a development release, 2.6-beta is considered experimental and is provided for testing purposes. This release is not recommended for production (even if you are excited by the new features).

    This release is made in conjunction with GeoTools 12-beta. Thanks to Kevin for making a beta release of GeoWebCache 1.6.0-beta with relatively little notice.

    What to expect … and what we expect from you

    A complete change log is available from the issue tracker. We will ask you to wait for 2.6.0 before we let Andrea write a pretty blog post with pictures illustrating the features that have been added. Instead, 2.6-beta is my chance to ask you to download GeoServer 2.6-beta for testing.

    Testing is a key part of the open source social contract, and the GeoServer team has identified a few areas where we would like to ask for help. This is your best chance to identify issues early, while we still have time to do something about them. For those making use of commercial support, ask your vendor about their plans for 2.6-beta testing. We would like to ensure the functionality you depend on is ready to go for a Q2 release.

    When testing GeoServer 2.6-beta, please let us know on the user list (or #GeoServer) how it works for you. We will be sure to thank you in the final release announcement and product presentations.

    Java 7 Testing

    With Oracle retiring Java 6 security updates the time has come to raise the minimum bar to Java 7.

    We know a lot of downstream projects (such as OSGeo Live) have been waiting for GeoServer to support Java 7. Thanks to CSIRO, Boundless, and GeoSolutions for providing Java 7 build environments, allowing us to make this transition in a responsible fashion.


    • This is a major testing priority on all platforms.
    • Windows 7: The start.bat used by the manual (run from archive) install has trouble running as an administrator. We recommend installing as a service for this release (GEOS-5687)
    • Mac: You will need to install Oracle Java 7 (as OpenJDK 7 is not yet available for OS X). We have not yet figured out how to run GeoServer.app with Java 7 (GEOS-6588) and are open to suggestions.


    WFS Cascade

    This is a really exciting change: swapping out our gt-wfs client code for a new gt-wfs-ng implementation with a new GML parser/encoder. After comparing the quality of the two implementations we decided to go all in with this transition … and thus would really like your help testing.

    We would like to hear back on cascading the following configurations:

    • GeoServer
    • deegree
    • MapServer
    • tinyows – there is a critical fix about axis order in the tinyows trunk. It corrects (finally!) the output … but perhaps not yet the filters?
    • ArcGIS
    • Other – any other WFS you are working with!


    • Pay special attention to the flags used for axis order. There are different flags to account for each way a WFS implementation can get confused. You will find some implementations expect the wrong axis order on request, but are capable of producing the correct axis order output.
    • We especially ask our friends in Europe to test WFS services published for INSPIRE compliance
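    The axis-order problem boils down to whether a server speaks (lat, lon) or (lon, lat) for a given CRS. A minimal sketch of what such a per-server flag does (the function name and flag below are mine, not GeoServer’s actual configuration):

```python
def normalize_axis_order(coords, server_sends_latlon):
    """Normalize coordinate pairs to (lon, lat).
    `server_sends_latlon` plays the role of the per-server axis-order
    flag described above; a server may also *expect* the wrong order
    on request, which would need a second flag for outgoing filters."""
    if server_sends_latlon:
        return [(x, y) for (y, x) in coords]
    return list(coords)

# A server that answers in (lat, lon) for EPSG:4326:
raw = [(51.5, -0.1), (48.9, 2.35)]
print(normalize_axis_order(raw, server_sends_latlon=True))
# -> [(-0.1, 51.5), (2.35, 48.9)]
```

This is why one flag per direction matters: an implementation can be wrong on input, output, or both, and each combination needs its own handling.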

    This was an epic amount of work by Niels and we have a couple of new features waiting in the wings based on the success of this transition.

    Curves support for GML and WMS

    A large amount of work has been put into extending the Geometry implementation used by GeoServer.

    We have experimented with several approaches over the years (including ISO 19107 and a code sprint with the deegree project) and it is great to finally have a solution. As a long-time user of the JTS Topology Suite, we have been limited to a point, line and polygon model of geometry. Andrea has very carefully extended these base classes to allow for both GML output and rendering. The trick is using a tolerance to convert the arcs and circles into line work for geometry processing.

    Testing for the 2.6-beta release is limited to those with Oracle Spatial. If you are interested in funding/volunteering support for PostGIS please contact the geoserver-devel email list.


    • Look for “Linearization tolerance” when configuring your layer.
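    The tolerance trick can be illustrated with a little geometry: a chord spanning angle θ on a circle of radius r deviates from the arc by the sagitta r(1 − cos(θ/2)), so capping the sagitta at the tolerance fixes how many chords are needed. This is only a sketch of the general technique, not GeoServer’s actual linearization code:

```python
import math

def linearize_arc(cx, cy, r, start, sweep, tol):
    """Approximate a circular arc by chords whose sagitta (max distance
    between chord and arc) stays under `tol` -- the idea behind a
    'linearization tolerance' setting. Angles are in radians."""
    # Largest per-chord angle keeping sagitta r*(1 - cos(theta/2)) <= tol
    theta_max = 2 * math.acos(max(-1.0, 1 - tol / r))
    n = max(1, math.ceil(abs(sweep) / theta_max))
    return [(cx + r * math.cos(start + sweep * i / n),
             cy + r * math.sin(start + sweep * i / n))
            for i in range(n + 1)]

# Quarter circle of radius 10; tighter tolerance -> more vertices
pts = linearize_arc(0, 0, 10, 0, math.pi / 2, tol=0.05)
print(len(pts))
```

The trade-off the setting exposes: a small tolerance gives smooth curves but more vertices to render and process; a large one gives visibly faceted arcs.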

    Advanced projection handling for raster

    We would like to hear feedback on how maps that cross the date line (or are in a polar projection) have improved for you.


    • No special settings needed


    Coverage Views

    We struggled a bit with how to name this great new feature; however, if you work with raster data, this is your chance to recombine bands from different sources into a multi-band coverage.


    • Use “Configure new Coverage view” when creating a new layer


    Startup Testing

    Yes this is an ominous item to ask you to test.

    GeoServer 2.6 simplifies where configuration files are stored on disk. Previous versions were willing to spread configuration files between the webapps folder, the data directory and any additional directories on request. For GeoServer 2.6 configuration files are limited to the data directory as a step towards improving clustering support and growing our JDBC Config story.


    • No special settings needed
    • Special request: check files that are edited by hand on disk (such as security settings and FreeMarker templates)


    Pluggable Styles

    For everyone happy with the CSS Style Extension we would like to ask you to test a change to the style edit page (allowing you to create a CSS or SLD style from the start).


    • Install CSS Extension and look for a new option when creating a style


    Wind barbs and WKT Graphics

    I am really happy to see this popular extension folded into the main GeoServer application.


    • Check GeoTools WKT Marks for examples you can use in your SLD file


    New Formats and Functionality

    We have new implementations of a couple of modules:

    • Printing – new implementation from our friends at MapFish
    • Scripting – includes a UI for editing scripts from the Web Administration Application

    A final shout out to ask for help testing new formats:

    • NetCDF
    • GRIB
    • OGR

    Language Support

    We are happy to announce the first release with support for Turkish. Many thanks to Engin Gem and the whole translation team for the initial contribution. All modules – core, extensions, and community modules – have been translated within eight months. Great success!

    French, Korean, Polish, and Romanian were corrected and updated to the latest developments. Thanks to all GeoServer Transifex translators, and to Frank for managing!

    Spot a translation mistake? Help translate here: GeoServer Latest localizations

    About GeoServer 2.6

    Articles and resources for GeoServer 2.6 series:

    Between the PolesCommercial-scale carbon capture and sequestration project begins construction

    About 37% of U.S. electric power generation is from coal-fired power plants. New EPA regulations scheduled to come into effect July 1, 2015 will restrict CO2 emissions from power plants. Coal-fired power plants will have to implement some form of carbon capture and sequestration (CCS), convert to natural gas or some other cleaner fuel, or shut down. CCS takes two forms: pre-combustion and post-combustion. I blogged about a pre-combustion commercial CCS implementation in Kentucky that began operation earlier this year.

    Construction of the first U.S. commercial-scale post-combustion carbon capture and sequestration (CCS) retrofit has begun. The post-combustion CCS technology will be installed at the coal-fired 240 MW Parish Generating Station in Houston, Texas.

    The Petra Nova project will capture 90% of the plant’s CO2 emissions through an advanced amine-based process, which was piloted in a three-year project in Alabama. The capture rate will result in lower greenhouse gas emissions than from a traditional natural gas-fired power plant. The process involves scrubbing the flue gases with an amine solution to form an amine–CO2 complex, which is then decomposed by heat to release high-purity CO2. The regenerated amine is recycled for reuse in the capture process. The CO2 capture and compression system will be powered by a cogeneration plant comprising a combustion turbine and heat recovery boiler.

    The captured CO2 will be compressed and transported via an 80-mile pipeline to increase oil output from an oil field with declining production. After separation from the oil, the CO2 will be injected underground for permanent sequestration, and the oil field will be monitored to verify that the CO2 remains underground.