Planet Geospatial

GIS Lounge: Is GIS Use in Crime Analysis Effective at Curbing Crime?

A new study by researchers at Sam Houston State University found that the use of GIS is widespread in analyzing crime data but that almost no studies have been done to determine if using spatial analysis is an effective tool in reducing crime rates.  “Geographic Information System Effects on Policing Efficacy: [...]

The post Is GIS Use in Crime Analysis Effective at Curbing Crime? appeared first on GIS Lounge.

LiDAR News: Goin Fishin

I have a chance to go fly fishing with my son in Montana so I will be taking a few days off. Hold the fort...

Directions Magazine: Getting Out of Harm’s Way: GIS Tool for Estimating Evacuation Times from Tsunamis

DeLorme Professional Weblog: In Our Customer’s Own Words: Thank You for Helping to Save my Daughter’s Life

Below is an unedited letter from an inReach customer following the rescue of his daughter via helicopter from a remote area of Rocky Mountain National Park.

September 8, 2014

DeLorme:

I’m writing to thank you for your extraordinary efforts that helped save the life of my daughter, Amanda Sandstedt. We were deep in the backcountry of Rocky Mountain National Park when she had a diabetic emergency that required prompt hospitalization. Our DeLorme inReach gave us our only hope for her survival and it performed textbook perfect, and so did the SOS dispatch personnel [GEOS].

We were alone at a remote location when her condition began to rapidly degrade for unknown reasons. Realizing that I couldn’t get her out on my own and that her condition required advanced medical attention, I used the SOS feature on my inReach and the iPhone Bluetooth feature and we were immediately in contact with your communications center. Through the efforts of your dispatch center, the Flight for Life crew, and some remarkable treatment at St. Anthony’s Hospital in Denver, my daughter Amanda not only survived, but made a full recovery within a few days. The doctors and nurses at the hospital considered her recovery miraculous, as they initially did not expect her to survive. It was only through the efforts of all these people that she had any chance for survival, yet today we are at home and returning to our normal routine.

Several times I’ve started to write this letter to you, the Flight for Life crew, and the hospital, but it’s hard to know how to thank someone for the life of someone you love. I’ve thought about how our lives could have been horribly changed and then I thought about how our lives were actually changed, in the end, maybe for the better. Amanda is back at home, in school, back to work, and back to being full of life. She is doing very well. She remembers very little of the whole event, which may be a blessing. However, one thing she is fast learning is that our relationship will never be the same. 

I can’t thank you enough for the extraordinary efforts of your great people who dispatched the help that saved Amanda. As I think back about all the people’s efforts that came together perfectly to result in a happy ending to our Rocky Mountain trip, it’s clear to me what a miraculous story of survival this is. Her margin for survival was slim at best, even when she did reach the hospital. I feel very fortunate and blessed to still have my daughter who I love so much. Having her survive such a misadventure is a sobering reminder to me of just how precious each day is that we have together. It’s safe to say that I am forever changed by the experience. Seeing how quickly you can lose a loved one (or in this case, almost lose a loved one), it makes me want to hug each of my children every time I see them (although I’m not sure they appreciate the parental affection). I recommend the inReach to all my friends and especially to those who venture into the wilderness. I have used the inReach for several years, typically to check in each day and let my wife know where we are and how things are going. I never imagined that it would be the essential tool that would save my daughter’s life. By the way, the iPhone Bluetooth feature worked to perfection, providing a continuous text conversation where we could exchange information vital to Amanda’s rescue.

Below is a picture of Amanda, now at home and doing very well. I’m sure she’s not the first to be saved by this amazing device, but she’s clearly the most special to me. All the good people at DeLorme should take pride in the life saving communications that this device provides. I’ll be forever grateful for every day I have with Amanda and for each moment I spend with all my family.

Sandstedt Photo

Thank you for the life changing products and work that you do every day. I’ve learned through experience what an essential tool the inReach is – I’ll never leave mine at home. Thank you for saving a very special life.

– Chip Sandstedt

P.S.

Thanks again to you and the great people at the emergency dispatch [GEOS]. At a time when the stakes were very high, everything worked just like you’d hope it would. What a great tool to have for a trip into the wilderness.


Spatially Adjusted: Open Data…

This should be the mantra of any open data website:

Paul Ramsey: PostGIS for Managers

At FOSS4G this year, I wanted to take a run at the decision process around open source, with particular reference to the decision to adopt PostGIS: what managers need to know before they can get comfortable with the idea of making the move.

The Manager's Guide to PostGIS — Paul Ramsey from FOSS4G on Vimeo.

My Georamblings...: Balloon Mapping & Raspberry Pi Workshop @ The Hacktory

There will be a Balloon/Kite/Pole Mapping workshop, using a Raspberry Pi as the sensor base, on October 18th at The Hacktory.

The course description is:

Balloons, kites and poles are low-cost, easy-to-use and safe methods for collecting aerial images and making maps. Aerial images are usually collected using satellites and airplanes, but these ground-based approaches provide an on-demand alternative for collecting information as events or environmental conditions unfold. This workshop will discuss grassroots mapping, lessons learned from the larger balloon mapping community, the components of a mapping kit and how you can get started.

If you are interested, register early; there are only 12 seats available for this workshop.

GeoIQ Blog: Lunch and learns! Office hours! Jack Dangermond! All this week.

This week is a busy week in DC! We have three events this week at our partner organizations, OpenGov Hub and 1776, including a meet up featuring Esri president and founder Jack Dangermond. Hope to see you there!

Open Data for the People: Mapping and Visualizing Government & NGO Data
Wednesday, September 17th from 12:30 PM to 2:00 PM at OpenGov Hub
Join us at OGH as we discuss how governments and NGOs can easily share their data in common formats, directly from the source. Citizens can then easily access and explore their authoritative data on ArcGIS Open Data. Washington DC government’s GIO, Tim Abdella, will be there to demonstrate the steps taken to launch DC’s newest open data site, opendata.dc.gov. Attendees will have a hands-on opportunity to explore ArcGIS Open Data and may even launch their own site. Join us, and bring your lunch! This event is open to the public. RSVP on Eventbrite here.

Esri Day at 1776 Campus
Wednesday, September 17th from 10:00 AM to 3:00 PM at 1776
As a partner of 1776, we’re ramping up our support of local startups in DC. We welcome all members of 1776 to join us on Wednesday to learn how Esri and ArcGIS can help achieve your goals. We’ll have office hours in the morning and afternoon, as well as a catered lunch and learn. Andrew Turner will speak on startups using location technology and demonstrate how your startup can leverage Esri’s services. All levels of expertise are welcome!

Esri DC Meet Up – See the Latest in 3D, featuring Jack Dangermond
Thursday, September 18th from 6:00 PM to 9:00 PM at 1776
This month’s Esri DC Meet Up will feature developments in 3D! You will see the latest in 3D technology and how GIS can transform 2D GIS data into smart 3D city models. The use of 3D GIS allows decision makers to highlight new insights and improve decision making. Special guest Jack Dangermond, Esri’s founder and president, will be on hand for an exclusive Q&A session. Get answers to all the questions you have about GIS and Esri this Thursday. This event is open to the public. RSVP on Meetup here.

——

Follow us on Twitter to keep up to date with DC R&D Center events.

GIS Lounge: Ordnance Survey Launches the GeoVation Housing Challenge

The Ordnance Survey has launched a new GeoVation challenge that offers £101,000 to the ventures that make the best use of geography, technology, and good design to offer solutions to long-term housing problems in the United Kingdom.  Looking to provide a way for people in Britain to [...]

The post Ordnance Survey Launches the GeoVation Housing Challenge appeared first on GIS Lounge.

All Points Blog: Passive RFID is passé ... Not so fast my friend

In today's Directions on the News podcast, I suggested that the coming of sensors inside the iPhone 6 and the advent of beacons plus other sensor technology might re-ignite enthusiasm for location-based sensors. This rebirth of sensors might displace radio frequency...

AnyGeo: Make your own maps with Google Maps Gallery & My Maps

Something new from the Google Maps crew today – Google My Maps. By the end of this year, all maps created in classic Google Maps will automatically upgrade to the new My Maps, but to get started right away, open …

It's All About Data: Aviation Weather: MeteoSwiss Enhances SIGMET & AIRMET Alerts with Maps

MeteoSwiss enhances aviation weather alerts with maps, using FME to interpret text-based reports and FME Server to automate production and distribute PDF results.

As anyone who has ridden out turbulence on an aircraft can attest, weather is a big deal for aviation. Big planes, little planes, helicopters – they all have to operate within their limits and either deal with or avoid weather that can affect a flight, whether the effects are just an uncomfortable ride or serious danger.

Aviation is global – and so too is aviation weather forecasting and reporting. To ensure clear communications and safe operations across borders, the International Civil Aviation Organization (ICAO) produces standards that all member countries (which is most of the world) use for communicating information to pilots, airlines, air traffic control, and other stakeholders. In the case of weather, these guidelines designate how and when information is shared.

A Bit About Aviation Weather Reporting

To the uninitiated, the “how” of weather reporting looks like some obscure form of shorthand (if you’re a pilot, you’ll want to skip ahead to the next bit):

BOSR WS 050600 SIGMET ROMEO 2 VALID UNTIL 051000 ME NH VT FROM MLT TO YSJ TO CON TO MPV TO MLT OCNL SEV TURB BLW 080 EXP DUE TO STG NWLY FLOW. CONDS CONTG BYD 1000Z.

Pilots (and other aviation professionals), though, learn to interpret this code very early in their training, and would know to look out for occasional severe turbulence below 8,000 feet above sea level, within an area defined by four airports (MLT, YSJ, CON, and MPV) in the Maine – New Hampshire – Vermont area, and that the conditions are expected to continue beyond 10:00 UTC.
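
To make that decoding a little more concrete, here is a minimal, hypothetical sketch (separate from the MeteoSwiss workflow described below) that pulls a few of those coded fields out of the example report with regular expressions in Python:

import re

report = ("BOSR WS 050600 SIGMET ROMEO 2 VALID UNTIL 051000 ME NH VT "
          "FROM MLT TO YSJ TO CON TO MPV TO MLT "
          "OCNL SEV TURB BLW 080 EXP DUE TO STG NWLY FLOW. CONDS CONTG BYD 1000Z.")

# The valid-until time is encoded as DDHHMM (day of month, then hours and minutes UTC).
valid_until = re.search(r"VALID UNTIL (\d{6})", report).group(1)

# The affected area is outlined by a closed chain of airport identifiers.
boundary = re.search(r"FROM ((?:[A-Z]{3} TO )+[A-Z]{3})", report).group(1).split(" TO ")

# "BLW 080" is a flight level: multiply by 100 to get feet.
ceiling_ft = int(re.search(r"BLW (\d{3})", report).group(1)) * 100

print(valid_until)   # 051000 -> day 05, 10:00 UTC
print(boundary)      # ['MLT', 'YSJ', 'CON', 'MPV', 'MLT']
print(ceiling_ft)    # 8000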

The “when” of it is also set out by ICAO. There are regular, predictably issued weather reports that are used for flight planning (that look a lot like the example above), and then there are SIGMETs and AIRMETs – which are special alerts issued when something significant occurs (like severe turbulence, or a rapidly forming storm, or volcanic ash) that needs immediate attention.

Consider an airliner on a ten-hour long-haul flight – a lot can change in ten hours, and these reports need to get to the pilots enroute for decision-making, particularly if a flight path change is called for.

Making It Better with Maps – and Automation

Over in Switzerland, Oliver Baer and his colleagues at MeteoSwiss are the ones responsible for producing and distributing this information. It seemed to Oliver that SIGMET and AIRMET reports, which are inherently spatial, would be greatly enhanced by adding a map view of the information – and that it should be something you could automate with FME Server. And so he turned to Certified FME Professional David Reksten of INSER in Lausanne to make it happen.

David developed a workflow that has three primary tasks – parse and interpret the report text, build the geometry, and create PDF output. FME Server automates it, running the tasks every five minutes and so producing new reports within minutes of new weather data being released.

Interpreting Text Reports

A TestFilter routes weather phenomena for attribute creation and eventual symbolization.

There are a few different kinds of details that need to be extracted from the report, and it’s ICAO’s standard coding that makes this possible. Location descriptions are parsed out to be used for geometry creation, and the workspace also checks which administrative boundary zones (FIR/UIR) will be needed to clip the locations.

Some feature preparation is done for eventual symbolization, including interpreting the phenomenon type.

Creating Geometry

In the example SIGMET report shown earlier, a polygon would need to be built from the known locations of the airports named in the report. A number of countries, including Switzerland, are moving to geographic coordinates instead – which makes geometry creation a bit clearer.

YUDD SIGMET 2 VALID 101200/101600 YUSO - YUDD SHANLON FIR/UIR SEV TURB FCST S OF N4300 AND W OF E02215 FL250/370 STNR WKN=

The polygons may be described by a single line (north of 43°N, for example), or two lines, like this example (south of 43°N and west of 22°15’E). There could be a diagonal dividing line, or even a multi-point polygon description – all provided by coordinates. All of these areas are ultimately clipped to the administrative boundary.

A polygon defined by one line of latitude and one of longitude, clipped to the administrative boundary (FIR).
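
As a rough illustration of the coordinate-based case (a hypothetical Shapely sketch, not the actual FME workspace, and the FIR outline below is made up), the clipping can be thought of as intersecting half-plane-style boxes with the FIR polygon:

from shapely.geometry import box, Polygon

# Made-up FIR outline; a real one would come from an aeronautical boundary dataset.
fir = Polygon([(20.0, 40.0), (26.0, 40.0), (26.0, 46.0), (20.0, 46.0)])

# "S OF N4300 AND W OF E02215": south of 43 degrees N and west of 22 degrees 15 minutes E.
south_of_43n = box(-180, -90, 180, 43.0)
west_of_2215e = box(-180, -90, 22.25, 90)   # 22 deg 15 min = 22.25 deg

sigmet_area = fir.intersection(south_of_43n).intersection(west_of_2215e)
print(sigmet_area.wkt)   # the clipped SIGMET polygon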

PDF Output

The final step is to assemble all the components in a PDF for distribution. The output features route into a pre-designed .mxd that handles symbolization and layout, and with a touch of Python the final result PDFs are written out.
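
For readers wondering what that "touch of Python" might look like, here is a hedged sketch (not the actual INSER script; the paths are placeholders) using the arcpy.mapping module to export a pre-designed map document to PDF:

import arcpy

# Placeholder paths; the pre-designed .mxd handles symbolization and layout.
mxd = arcpy.mapping.MapDocument(r"C:\sigmet\sigmet_layout.mxd")
arcpy.mapping.ExportToPDF(mxd, r"C:\sigmet\out\sigmet_report.pdf")
del mxd   # release the file lock held by the MapDocument object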

This sample report shows two overlapping SIGMET alerts.

The finished PDFs are made available by MeteoSwiss to end users via an FME Server data streaming service, and make their way into the hands of airlines, flight planners, and pilots within minutes, now including the mapped interpretation of the report for clearer communications. And that means faster and better planning on the ground – and decision-making in the air.

The workflow is also easily extendable to other regions – any region that is using geographical coordinate descriptions, in fact. “FME is incredibly flexible,” says David. “It makes developing, deploying, and maintaining data integration processes a breeze.”


 

Great footage of a jumbo landing in high crosswind conditions. (Disclaimer: some experts believe this video may not be authentic.)

Learn more about the technicalities of PDF streaming on FMEpedia:

Using the data streaming service to stream PDF

More on Aviation

From the 2014 FME UC: Runway & Taxiway Incursions (and FME Cloud)

KDOT Aviation: Keeping the Skyways Clear with FME (and Google Earth)

More on Weather

From the 2014 FME UC: Real Time Lightning Alerts from The Weather Network

The post Aviation Weather: MeteoSwiss Enhances SIGMET & AIRMET Alerts with Maps appeared first on Safe Software Blog.

Directions Magazine: City of Saskatoon Recognized for Increased Citizen Engagement through GIS

All Points Blog: Google Opens its Map Gallery to All; Re-renames My Maps

Google invites everyone not only to search its Map Gallery, but now to add to it. Those with Google Maps Engine or My Maps maps can switch sharing to "public" for access by anyone. When launched in February of this year, the Map Gallery included only "selected" maps; now it will be...

MapBrief: Open—Wide Open—in Portland: A FOSS4G Review

  Having been to previous FOSS4G conferences—2007 in Victoria, 2011 in Denver—I was eager to circle back to this year’s iteration in Oregon to assess what’s changed, what’s new, and what I can adapt for everyday use in my consulting business. I. Sprawl With up to eight tracks running simultaneously, everyone’s conference was going to [...]

Azavea Atlas: Summer of Maps: Daytime Population Estimation and its Effect on Risk Terrain Modeling of Crime

This entry is part 6 of 6 in the series Summer of Maps 2014

Now in its third year, Azavea’s Summer of Maps Program has become an important resource for non-profits and student GIS analysts alike. Non-profits receive pro bono spatial analysis work that can enhance their business decision-making processes and programmatic activities, while students benefit from Azavea mentors’ experience and expertise. This year, three fellows worked on projects for six organizations that spanned a variety of topics and geographic regions. This blog series documents some of their accomplishments and challenges during their fellowship. Our 2014 sponsors, Google, Esri and PennDesign, helped make this program possible. For more information about the program, please fill out the form on the Summer of Maps website.

 

When using Census data for research or analysis, sometimes the standard total population count for a region just doesn’t suffice. Transportation planners and crime analysts, for example, must account not only for residential populations but also “daytime” or “commuter-adjusted” population data, since many people spend most of their days working or running errands in different Census tracts, different towns, or even different regions from their homes. Nowadays we’re always on the go, so shouldn’t our population data reflect that?

I encountered this daytime population issue this summer while working as a Summer of Maps fellow with DataHaven to analyze the geographies of crime risk in New Haven, Connecticut. In this project, we used the Risk Terrain Modeling Diagnostics (RTMDx) Utility, a software application that uses crime data and the spatial influences of potential “risk factors” to model where conditions are ripe for crimes to occur in the future. One of the influencing factors of crime we used in our analysis was population density. While the exact effect of population density on crime rates is the focus of ongoing criminology research, in our study, we proposed that crimes would occur where high volumes of people were located. Since our focus here was on where people are “located” and not necessarily where they “live,” we incorporated commuter-adjusted population estimates to account for New Haven’s daytime population.

To acquire daytime population estimates, some data assembly is required. The Census Bureau provides instructions on calculating daytime population estimates using American Community Survey or Census 2000 population data. My first step in calculating daytime population was to download workplace geography data from Census Transportation Planning Products (CTPP), which includes Census data particularly useful to transportation planners. I selected “2006-2010 CTPP Tract-Tract work flow data” and followed the download instructions to get tract-to-tract population flow counts from residences to workplaces. I then queried the Access database to extract all records including a residence tract OR a workplace tract located in Connecticut to account for interstate commuting. With these statewide commuter counts, I was able to hone in on New Haven Census tracts and calculate the total number of workers working and/or living in New Haven. Lastly, I used the Census Bureau’s “Method 2” for calculating daytime population:

Total resident population + Total workers working in area – Total workers living in area
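
As a toy illustration of Method 2 (the numbers below are made up for a single hypothetical tract; the real inputs come from the ACS and CTPP tables described above):

# Census Bureau "Method 2", with made-up numbers for one hypothetical tract.
resident_population  = 4200   # total residents of the tract
workers_working_here = 6100   # commuters whose workplace tract is this tract
workers_living_here  = 1900   # employed residents of the tract, wherever they work

daytime_population = resident_population + workers_working_here - workers_living_here
print(daytime_population)   # 8400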

With both resident and commuter-adjusted population counts available, the next stage of the analysis was to incorporate this data into the RTMDx. I created risk terrain surfaces across four crime types (robbery, burglary, simple assault, and assault with a dangerous weapon) and two population counts (resident and daytime populations), producing eight risk maps in total. Each risk terrain model (RTM) included five risk factors in addition to the population count: foreclosures, bus stops, schools, parks, and job locations related to retail, entertainment, and food service (provided by Census LODES data via DataHaven).

In the figures below, we can compare the different risk terrain surfaces created for assaults using resident population and daytime population. The risk terrain surfaces are displayed with the "relative risk score" produced by the RTMDx Utility. To interpret the map, an area with a risk value of 50 has an expected rate of assault 50 times that of an area with a score of 1. The higher the score, the greater the risk of an assault based on the model.

Risk terrain surfaces for assaults, modeled with resident population and with daytime population.

In comparing the geographies of crime risk between resident and daytime population counts in central New Haven, we see generally higher risk scores when resident population is modeled. The heavily residential neighborhoods surrounding downtown New Haven see greater risk scores with resident population, perhaps owing to the fact that many residents here commute to jobs in other neighborhoods or cities during the day. Alternatively many of these neighborhoods, including Fair Haven, Newhallville, and Edgewood, see sharply increased risk scores when resident population is considered. The effect of population is more difficult to gauge in downtown New Haven, which is dominated by Yale University and Yale-New Haven hospital, the city’s two largest employers. Despite a much larger daytime than resident population, assault risk scores decreased when accounting for daytime population. This could be due to the nature of assault crimes in relation to population density, the geography of assault incidents in our crime dataset, the role of uncounted university students in influencing assault patterns, or other issues. Our results demonstrate that while daytime population is an important element to consider in risk terrain modeling, crime risk analysis remains a complex and inexact science.

While some spatial analyses may not require the granularity of daytime population estimates, using commuter-adjusted population data has important implications when exploring time-sensitive phenomena like crime or transportation dynamics. The Census may not be able to account for population spikes associated with university students, tourism or shopping, but CTPP data still gets us closer to understanding where people spend their days outside of the home.

Directions Magazine: DigitalGlobe Webinar: Proteus Discusses Satellite-Derived Forest Inventory

LiDAR News: Geomatics Field Test Facility

One of the ideas is an independent field test facility that could be the focal point for this entire effort.

The Map Guy(de): Announcing: mapguide-rest 0.10

Nope, this isn't a 1.0 release. Not only are we not using decimal release numbers, but there are still plenty of things to explore and refine before we can put the 1.0 stamp on this thing. Here's what's new and changed in this release.

(Experimental) Cesium CZML support

We now have support for outputting feature data as CZML for consumption inside the Cesium 3D web viewer. Support for CZML is made available as a representation of a given Layer Definition.

For example, the following route will return data from the trees layer as CZML:

http://localhost/mapguide/rest/library/Samples/Sheboygan/Layers/Trees.LayerDefinition/features.czml

We've included a CZML example in this release that demonstrates the level of support that has been implemented for this release.
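
As a quick way to try it, a hedged Python sketch (assuming a local MapGuide install with the Sheboygan sample data and mapguide-rest under /mapguide/rest) could fetch and inspect that route like this:

import requests

# Adjust the host and resource path to match your own installation.
url = ("http://localhost/mapguide/rest/library/Samples/Sheboygan/Layers/"
       "Trees.LayerDefinition/features.czml")

resp = requests.get(url)
resp.raise_for_status()
packets = resp.json()   # CZML is plain JSON: a list of packets Cesium can load
print(len(packets), "CZML packets")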


When selecting an object, its tooltip data will be shown in Cesium's information window if available.


Note the object itself has no selection indicator unless the object is a point. Still trying to figure out if we can apply a different style for selected lines and polygons.

If your Layer Definition has elevation settings applied, they will be extruded if they're polygons.


Now if the (experimental) tag didn't stand out, here are the current limitations of this implementation:
  • The following properties are preserved when converted to CZML, anything not listed can be assumed lost in translation:
    • Point styles: Point color is preserved. The point size in CZML is the average of the height/width of the point as defined in the Layer Definition.
    • Line styles: Line color
    • Area/Polygon styles: Fill color. Outline color
  • If a Layer Definition has multiple scale ranges defined, mapguide-rest will only consider the first scale range when outputting CZML
  • If a given point/area/line style is themed, the default rule (the one without a filter) is ignored
Data publishing improvements

The restcfg.json now lets you specify a Layer Definition as the data source instead of a Feature Source and Feature Class name.

Also, when using a Layer Definition as a data source, you'll get tooltip, hyperlink and elevation FDO expressions pre-evaluated, allowing you to use such computed properties within your templates.

You can find a new example that uses the building footprints from the Melbourne dataset.

File download support

Most GET routes can now prompt for downloads by appending download=1 to the query string of the URL.

XYZ tile improvements

Vector tiles can now be generated for a base layer group or for a single layer within that group.

Just to recap, this URL route fetches a vector tile for a given base layer group in a Map Definition

http://servername/mapguide/rest/library/{resourcePath}.MapDefinition/xyz/{groupName}/{z}/{x}/{y}/tile.{format}

If you want vector tiles for a specific layer within that group, you can use this new URL route

http://servername/mapguide/rest/library/{resourcePath}.MapDefinition/xyz/{groupName}/{layerName}/{z}/{x}/{y}/tile.{format}

You can find a new Leaflet example that uses the single-layer vector tiles.


Other changes

  • Fixed bad download links in the resource data list HTML representation
  • Fixed invalid chunked file transfer behavior under certain conditions
  • Samples updated to use OpenLayers 3 final and Cesium 1.1
Download

Directions Magazine: SPAR Europe 3D Measurement & Imaging Conference 2014 Programme Posted

Directions Magazine: Sam Houston State study examines use of GIS in policing

Directions Magazine: New Oregon Maps Feature National Scenic Trails

Directions Magazine: Avenza Releases Geographic Imager 4.5 for Adobe Photoshop

Directions Magazine: Apple, Sensors and Geodata Overload

Sean Gillies Blog: Python at FOSS4G 2014

Python at FOSS4G 2014

There were plenty of other Python talks at FOSS4G and I plan to watch them when the videos are online (update: talks are appearing now at http://vimeo.com/foss4g). I hadn’t been aware of ogrtools, which is unlucky because there’s plenty of functional overlap between it and Fiona. The designs seem rather different because Fiona doesn’t emulate XML tool chains (GDAL’s VRTs are not unlike XSLT) and is more modular. For example, where ogrtools has a file-to-file ogr translate command, Fiona has a fio dump and fio load pair connected by a stream of GeoJSON objects. The ogrtools talk is right near the top of my list of talks to see.

I was very fortunate to go right after Mike Bostock’s keynote. It got people thinking about tools and design, and that’s exactly the conversation that I’m trying to engage developers in with Fiona and Rasterio, if with less insight and perspective than Mike. I reminded attendees that the best features of our day-to-day programming languages are sometimes disjoint and showed this diagram (in which C is yellow, Javascript is magenta, and Python is blue. By “GC” I mean garbage collection and by “{};” I mean extraneous syntax).

http://sgillies.github.io/foss4g-2014-fiona-rasterio/img/py-js-c.png

D3 embraces browser standards and all they entail (a world wide knowledge base and continuous performance improvements) and Fiona and Rasterio embrace the good parts of Python. Python written like C, as we usually see in GDAL/OGR examples on the web, is quite slow. Idiomatic Python, including the good parts like list comprehensions, generators, and iterators, is dramatically faster. While Fiona and Rasterio don’t do particular operations faster than the older GDAL and OGR bindings (because it’s the same C library underneath), they are designed from the bottom up for a good fit with more efficient idiomatic Python code.
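
For example, a minimal sketch of the idiomatic style in question (the shapefile name and attribute are placeholders): a Fiona collection is a context manager that iterates over GeoJSON-like feature dicts, so a filter becomes a comprehension rather than a C-style index loop.

import fiona

# Placeholder dataset and attribute; any vector source Fiona can read will do.
with fiona.open("parcels.shp") as src:
    big_parcels = [f for f in src if f["properties"]["AREA"] > 10000]

print(len(big_parcels))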

I plugged Click and Cython in my talk, too, and discussed them afterwards. I found tons of interest in Python at FOSS4G and lots of good ideas about how to use it.

I confess that I didn’t pay a lot of attention to the talk schedule before the conference. My summer was kind of nuts and I don’t subscribe to any OSGeo lists. When I did look closely I was surprised to find that many people were giving two talks and some three. If any woman or first-timer didn’t get a chance to speak while some dude got three (and the multiple talkers were all men and long time attendees as far as I can tell) – that’s a bug in the talk selection that needs to be fixed before the next edition.

Lastly, I think the views of Mount Hood you get when flying in and out of PDX to destinations south and east are worth the airfare all by themselves.

https://farm6.staticflickr.com/5587/15249959145_91e47b3444_c_d.jpg

AnyGeo: The Nation is Calling for 3D Data – via the USGS

This from the USGS… The 3D Elevation Program (3DEP) is a national initiative to accelerate the collection of 3-dimensional elevation data, to manage the authoritative lidar and ifsar datasets, and to provide elevation products and services to everyone for applications ranging from …

Directions Magazine: LandWorks Adds Linear Project Routing to Online Property Mapping Service

GIS Lounge: Mapping Global Carbon Dioxide Emissions

Researchers recently published the results of developing a system for measuring global carbon dioxide (CO2) emissions in the Journal of Geophysical Research.  Named the “Fossil Fuel Data Assimilation System,” or FFDAS, the system is able to quantify CO2 emissions at the city level over fifteen years, from 1997 – 2009.  The research represents [...]

The post Mapping Global Carbon Dioxide Emissions appeared first on GIS Lounge.

Boundless: The Spatial IT Job Board

Those of you who’ve seen Paul Ramsey’s Spatial IT and the Spatial Web presentation or read Michael Terner’s blog piece know that we see a great future for software developers and IT professionals with an interest in spatial technology. Below are several job openings we’ve seen in the past month.

Job Listings

There are plenty of opportunities for Spatial IT professionals but if we missed any relevant positions please contact us and we’ll be sure to include them in future job board posts.

The post The Spatial IT Job Board appeared first on Boundless.

Sean Gillies Blog: Back from FOSS4G

Back from FOSS4G

In my experience, FOSS4G was tons of fun and very well run. Chapeau to the organizing team! I hope other attendees got as much out of the conference as I did. Not only did I get to catch up with people I met at the dawn of FOSS4G, I met great people I’d only known from Twitter and made entirely new acquaintances. I even got to speak a bit of French.

My talk was one of the first in the general sessions. I had fun presenting and am told that I did a good job. My slides are published at http://sgillies.github.io/foss4g-2014-fiona-rasterio/ and you can fork them from GitHub. According to the information at the FOSS4G Live Stream page all the talks will be available online soon. I missed plenty that I’m looking forward to seeing on my computer. Out of the ones I attended, I particularly recommend seeing the following:

  • “Using OpenStreetMap Infrastructure to Collect Data for our National Parks” by James McAndrew, National Park Service
  • “Managing public data on GitHub: Pay no attention to that git behind the curtain” by Landon Reed, Atlanta Regional Commission
  • “Big (enough) data and strategies for distributed geoprocessing” by Robin Kraft, World Resources Institute
  • “An Automated, Open Source Pipeline for Mass Production of 2 m/px DEMs from Commercial Stereo Imagery” by David Shean, University of Washington

Did the code of conduct work? I heard one speaker invoke images of barely competent moms – “so easy your mother can do it” – and was present for an unfortunate reference to hacking private photos at lunch time. I hope that was all of it.

If you attended FOSS4G or watched the live feed I encourage you to write about your experience and impressions. Come on, do it. It doesn’t have to be long or comprehensive. Here are a few blog posts I’ve seen already:

Directions Magazine: Valtus Engages eMap International as Newly Authorized Sales Agent Partner

Directions Magazine: Comporium Selects Ubisense Enterprise Location Intelligence Solutions

Directions Magazine: Thinknear Report Reveals Industry Improvement in Location Accuracy

Directions Magazine: MAPPS Announces Judges for Eighth Annual Geospatial Products and Services Excellence Awards

LiDAR News: Making Effective Use of 3D Models

We are making progress on acquiring the data, but there is a lot of work to be done to make it useful after the construction is completed.

Directions Magazine: LizardTech Teaming up with Directions Magazine to Talk About Developing an Aerial Image Workflow for UAVs

Directions Magazine: Esri Canada Wins Gold at CDN Channel Elite Awards

Directions Magazine: Intergraph Participates in CityNext

Directions Magazine: Colin Thomson Joins 3D Laser Mapping as Technical Director for Mining & Monitoring

Directions Magazine: A Global View of Earth and the Environment: An Interview with Barbara Ryan, Director, Group on Earth Observation (GEO)

VerySpatial: A VerySpatial Podcast – Episode 478

thematic mapping blog: Geotagging photos using GPS tracks, ExifTool and Picasa

I take a lot of photos while trekking, and most of the time I'm also carrying a GPS with me. As my camera doesn't have a built-in GPS, my photos are not geotagged while shooting. Luckily, geotagging them afterwards is an easy task if you've kept your GPS logs from the trip.

I'm still very happy with my Garmin GPSmap 60CSx that I bought 7 years ago. With a change to its setup, the GPS automatically saves the tracks to the memory card. I get one GPX file for each day of trekking, named with the date. I can easily transfer these tracks to my computer or smartphone with a cable or a card reader.

Before I converted to Mac, I used GeoSetter to geotag my photos on Windows. Now, I want to do it on the command line using the great ExifTool by Phil Harvey. I installed it on my MacBook using Homebrew:

brew install exiftool

After copying my GPX file to the image folder, I'm simply running:

exiftool -geotag=my.gpx ./

If you forgot to sync the camera and GPS time before your trip, you can use the geosync option to fix it (60:00 = 60 minutes):

exiftool -geotag=20140329.gpx -geosync=-60:00 ./

You have a lot of options, so make sure to read the "Geotagging with ExifTool" documentation. ExifTool modifies the Exif headers of your image files, storing the location data in the file itself.

To see the result on a map, I'm using Picasa.  


Click the map pin button (bottom right) to see the map. If the positions are not shown on the map, try right-clicking the image folder and selecting "Refresh Thumbnails".

If you don't have a GPS track you can always use Picasa to manually geotag your photos. 

Be aware! I just learnt that social media sites like Facebook, Twitter and Instagram remove the Exif data from your images. Google+ doesn't.

Now, how can you display the photos on your own map? That will be the topic of my next blog post.

Open Source Computing and GIS in the UK: OSGIS 2014

Two weeks ago now saw the return of the OSGIS conference in Nottingham, after a year off in 2013 for FOSS4G. I think there had been mixed feelings about this event; those of us heavily involved in the organisation of FOSS4G 2013 had taken a back seat this year, and with FOSS4G 2014 imminent in Portland, it was clearly going to be a smaller scale get together.

I have to say that overall, my impression is that small is good! Small allows you to chat to everyone, see everything you want to see, and generally enjoy yourself, rather than rush around like a mad man or woman. It was nice to see some new faces, and to see a number of papers from local government and business, belying the idea that OSGIS is primarily an academic event. Thanks as always to the chaps at Nottingham for organising.

Astun had a strong showing at the event, with two workshops and two presentations. My colleague Matt Walker did a workshop on OpenLayers 3 and Leaflet, and I did one on WPS and pgRouting (a beginner's guide). I did a quick introduction to Portable GIS, and another colleague, Antony Scott, did a comparison of web servers. You can see the workshops at the Astun Technology GitHub pages. As a slight techy aside, Matt and I collectively decided to try GitBook for preparing our workshops, and we're both very impressed. If you're interested in a cheeky workflow for pushing a gitbook directly to GitHub gh-pages, see this gist.

Steven Feldman, another Astun employee/adviser, also did a talk entitled "There's no such thing as a free lunch", a continuation of an emerging theme in open source geospatial at the moment on getting companies to contribute more, or at the very least acknowledge and thank the open source components they use. As always this was very thought-provoking (although I worry a little that making people feel responsible for the software they use might potentially backfire) – you can find Steven's blog post on the talk here.

OSGIS has always had a strong relationship with the OSGeo:UK local chapter, but unfortunately that's been a bit inactive since FOSS4G – sometimes it's hard to know what to do next when you've fulfilled one of your primary goals! We're going to have a go at rebooting the local chapter now, though – a more detailed post on this will follow soon. The short version is, I'm back co-chairing after a two year absence, and we've got some good ideas going forward. Watch this space!

LiDAR News: Handheld Scanner Supports Forest Applications

Using the ZEB1 they achieved significant advantages in speed of data capture, quality of the resulting point cloud and ease of use of the system.

GeoServer Blog: Java Code Sprint

Day 1 Quality Assurance

Thanks to the foss4g and WhereCampPDX for

Automated Testing with CITE Team Engine

Andrea and Justin have led the charge updating the GeoServer CITE tests (see Cite Sprint).

The goals are initially modest: enable developers (other than Justin) to set up and run CITE tests.

Justin is working on updating our “easy to use” CITE test harness build, while the others are hitting the latest version of the CITE tests and checking both ends, tests and GeoServer, for errors (and finding issues on both sides): Andrea Aime is working on the WCS tests, Mauro Bartolomeoli on the WFS ones, Jared Erickson and Brad Hards on the WMS ones.

GeoServer Manual Testing with 2.6 Nightly

Our plea to test 2.6-RC1 was not incredibly successful, so we are in for a bit of manual testing:

Thanks: Cristiane Andrioli, Flavio Conde, Ivan Martinez

Random fixes:

  • Update link for nightly release (Jody)
  • Nightly build is running again (Justin)
  • Documentation for weather symbols, custom WKT symbols, bulk custom WKT geometry
  • Testing of marlin, install instructions, package as geoserver extensions (Chris Marx, Ian Turton)
  • Testing Oracle+Curved test (Ian Turton)

GeoServer Day 2

If you would like to join us tomorrow:

  • Sunday, September 14th: Boundless is arranging facilities at nedspace from 10am-4pm.

GIS Lounge: Presentations from OSGIS 2014 Now Online

All recordings from the recently held OSGIS 2014 are now available online.  Themed “Building up Open Access, Open Education and Open Data for Open Science”, the fifth open source GIS conference is part of the “Geo for All” initiative.  The conference featured keynotes by Professor John Wood (Secretary General, Association of Commonwealth Universities), Karen Parkin [...]

The post Presentations from OSGIS 2014 Now Online appeared first on GIS Lounge.

The Map Guy(de): 300,000 views!

In the race between 300,000 page views and 350 blog posts, the page views crossed the finish line first. For the record, this is my 348th post.

Thank you all for your continued viewership!



GIS Lounge: Mapping Forest Disturbance with Landsat

The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) takes advantage of the 30-year Landsat archive to inventory recent disturbances and forest-cover change. Using mid-summer, cloud-free Landsat data from the Global Land Survey (GLS) project, LEDAPS first corrects the images to remove atmospheric effects from surface reflectance (source code for LEDAPS) before applying [...]

The post Mapping Forest Disturbance with Landsat appeared first on GIS Lounge.

thematic mapping blog: Geotagging and Picasa Web Albums API, or was it Google+ Photos?

In my last blog post, I presented a new plugin, Leaflet.Photo, that allows you to display geotagged photos from any source. Among them were Google+ Photos and the Picasa Web Albums API. My plan is to use this API for my travel map, and this is why.

Does Picasa Web Albums still exist? 
It's a bit messy these days. Google is trying to transition from Picasa Web Albums to Google+ Photos, as photos are the number one thing that people want to share on social networks. When you use Picasa to share your albums (Sync to Web), the album URL is now on your Google+ profile, and not on Picasa Web Albums (which just redirects me to Google+). This is the URL to the public album from my trip to the Trollfjord:

https://plus.google.com/photos/+BjørnSandvik/albums/6052628080819524545

It also works with your Google+ user id:

https://plus.google.com/photos/118196887774002693676/albums/6052628080819524545

My public Google+ web album. The album contains both photos and videos. 

The thing is, there is no Google+ API for photos and videos yet (apparently they were working on it back in 2011). But the Picasa Web Albums API still works on your Google+ albums.

The Picasa Web Albums API is not the easiest API I've worked with, but it's flexible and quite fast. This is an XML feed of my public album from the Trollfjord:

https://picasaweb.google.com/data/feed/api/user/118196887774002693676/albumid/6052628080819524545

The user number and album id are the same as above. Or, better for your JavaScript apps, a JSON feed:

https://picasaweb.google.com/data/feed/api/user/118196887774002693676/albumid/6052628080819524545?alt=json

And if you're still using JSONP:

https://picasaweb.google.com/data/feed/api/user/118196887774002693676/albumid/6052628080819524545?alt=json-in-script

If you click on any of these links, you'll see that it's not a very compact format. There is a lot of data that you don't need. Although it's a bit complicated, you can select the fields you want to include in the feed. This is how I selected the following elements:
  • Photo URL: entry/media:group/media:content
  • Photo caption: entry/media:group/media:description
  • Photo thumbnail URL: entry/media:group/media:thumbnail
  • Photo timestamp: entry/gphoto:timestamp
  • Photo location: entry/georss:where

This is the new URL:

https://picasaweb.google.com/data/feed/api/user/118196887774002693676/albumid/6052628080819524545?alt=json&fields=entry/media:group/media:content,entry/media:group/media:description,entry/media:group/media:thumbnail,entry/gphoto:timestamp,entry/georss:where
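
For a JavaScript-free sanity check, here is a hedged Python sketch of reading that trimmed feed. The key names follow my understanding of the GData JSON mapping, where namespace prefixes become keys joined with a '$' and text content sits under '$t' – verify them against your own feed before relying on this:

import requests

feed_url = ("https://picasaweb.google.com/data/feed/api/user/118196887774002693676/"
            "albumid/6052628080819524545?alt=json&fields=entry/media:group/media:content,"
            "entry/media:group/media:description,entry/media:group/media:thumbnail,"
            "entry/gphoto:timestamp,entry/georss:where")

feed = requests.get(feed_url).json()["feed"]
for entry in feed.get("entry", []):
    # Thumbnail URL and the "lat lon" pair from georss:where / gml:Point / gml:pos.
    thumb = entry["media$group"]["media$thumbnail"][0]["url"]
    lat, lon = map(float, entry["georss$where"]["gml$Point"]["gml$pos"]["$t"].split())
    print(thumb, lat, lon)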

While researching, I also learnt that I could use the imgmax attribute to specify the size of the photos referenced in the photo URL. Neat!

So why should I use this (relatively) old API?
Compared to other popular social media sites, Google doesn't strip off the meta information of your photos. Instead it uses the built-in support for image metadata extensively. Hopefully Google will continue to do this, although social media sites have reasons for not doing so.

This means that Google doesn't lock you in. I can change the location of my photos using my GPS tracks, and it's reflected wherever I embed my photos. I can edit the image captions in Picasa and they're stored within the image file, allowing me to write the caption once and use it everywhere.

So what is my album workflow for my travel map? Before starting my journey, I create a new Google+ album. The feed from this album is attached to my map by simply passing on the album id. While on the journey, I use the Google Photos app to add photos to the album, which will automagically show up on the map as well. Back from the trip, I can add and edit photos from my digital camera in Picasa and sync them to the web album.

Update 12 September 2015: I'm having trouble uploading images with the Google photos app. The images are geotagged and the location data shows in the Google+ album, but unfortunately the location data is not included in the API feed. Please notify me if you're able to get this to work. 

Photos from Google+ shown on my travel map. 

PS! This blog post is not sponsored by Google :-) 

All Points Blog: GIS Health News Weekly: Florida’s New Food Map, Asthma and GPS, California’s West Nile Cases

Florida Roadmap to Healthy Living: The Florida state Agriculture Department launched an online map identifying "food deserts" and pockets of residents with nutrition-related health problems to help guide its response. Florida's Roadmap to Living Healthy is the first use of GIS to...

Directions Magazine: New Editor for CaGIS Journal

Directions Magazine: MAPPS Selected for FAA Working Group on UAS

Between the Poles: Balancing electricity demand and intermittent renewable generation

Over the past decade, distributed generation, especially solar photovoltaic (PV), consumer demand response programs, and other flexible distributed resources of electric power have grown dramatically. Until now, with a few exceptions such as Germany and Hawaii, distributed intermittent resources have represented a relatively small proportion of total power generation. However, as we face the prospect of scaling up the use of flexible distributed energy resources, including affordable energy storage batteries, attention in many jurisdictions is focussing not only on the economics of distributed energy, but also on the control systems that will balance intermittent micro-generation resources and consumer demand, and on the rapid evolution of new consumer devices, aka the Internet of Things.

Renewable energy sources like solar and wind power are being used more and more worldwide, while the market share of conventional power stations is decreasing. Wind and solar are intermittent sources of power, and balancing these power sources with consumer demand becomes a serious challenge when intermittent sources represent more than about 20% of total demand. For example, distributed generation with many small sources of power feeding energy at medium and low voltage levels can reverse the load flows from lower to higher voltage levels.

All of these changes affect the provision of system services which balance supply and demand. For example, conventional power stations not only provide most of the balancing energy required in the system, but the inertia of their generators also guarantees the provision of instantaneous reserves for immediate frequency support. Other important system services include voltage maintenance, operation management and re-establishment of power supply.

Germany's Energiewende

Germany's decision to shut down its nuclear power plants in favour of renewable energy (and increased use of coal) is fundamentally changing the country's energy supply. According to Fraunhofer ISE, in the first half of 2014 solar and wind power plants together produced more than 45 TWh, or approximately 17% of Germany's net electricity generation. All renewable energy sources, including hydro and biomass, produced a total of about 81 TWh and accounted for approximately 31% of German net electricity production.

For example, the Unterfränkische Überlandzentrale eG (ÜZ) in Lülsfeld is a medium-sized utility company in Northern Bavaria. It operates a typical regional distribution network in Germany - a large area with many small towns and villages and plenty of space for renewable plants. In 2013 its total electricity demand was 497 million kWh, of which renewables supplied more than 285 million kWh - more than 50% of its electricity demand.

To address the challenge of balancing generation and demand with increasing intermittent energy sources distributed over larger areas, the Deutsche Energie-Agentur GmbH (dena) - the German Energy Agency - commissioned a study to determine the scope of grid system services in the context of an increasing supply of intermittent energy. The dena Distribution Grid Study is a detailed examination of the need for expansion and conversion in the German electricity distribution grids, based on two alternative expansion scenarios for renewable energy sources. The results document a significant need for expansion by 2030. The study also analyses technical options for reducing grid expansion requirements. It found that the use of innovative grid operational resources, the adaptation of technical guidelines and the down-regulation of generation peaks from decentralised generation systems could reduce the need for grid expansion.

Sacramento Municipal Utility District

In Sacramento, California, some neighborhoods already generate much more power than they use and send power back onto the grid. Sacramento Municipal Utility District (SMUD) engineers had thought this would cause problems in balancing load with generation and raise the risk of damaging equipment. But so far that hasn't happened. SMUD has a working group on solar issues that is looking at system enhancements that might be necessary to handle more local solar on the system in the future. Estimates of SMUD's solar potential suggest it could produce around 1,400 MW. SMUD experiences a minimum load of 800 to 900 MW and a peak load around 3,300 MW. SMUD engineers expect that there is an "optimal" amount of solar power to have on the system that is probably less than the maximum that is technically feasible. In addition, utilities have to make provision for what happens when the sun doesn't shine during peak load periods. Building so-called gas-fired peaking plants to provide backup power is expensive because they typically sit idle most of the time.

Changing the utility price structure for rooftop solar PV power

One way of addressing the balancing issue is to change the price structure that a utility uses to pay for solar power generated by its customers. The Salt River Project (SRP) has about 9,000 solar customers representing about 800 MW of rooftop solar PV capacity. Mark Bonsall has pointed out that in the traditional utility model, comparing variable and fixed costs with fixed and variable revenue shows there is a mismatch, which translates into a revenue shortfall or unrecovered cost when comparing a non-solar customer's annual bill with a solar customer's. From Mark Bonsall's perspective, the problem is not the technology, it is the price structure, which he feels has to change.

A graph of the PV power output from 800 of SRP's solar customers shows that the peak output varied from 11 am for some customers to 4 pm for others. SRP's demand peak is around 6 pm, when everyone goes home from work and turns on their air conditioners. Mark Bonsall suggested that SRP's price structure should encourage solar customers whose PV output peaks closer to SRP's demand peak. This means a price structure based on demand (kW), not just energy (kWh), which is a different business model from that used by utilities today.

Energy storage

The renewables + energy storage market is getting interesting and has been forecast to grow to $2.8 billion in 2018. SolarCity offers a small-scale residential and commercial solar + battery system which uses Tesla batteries, and Tesla has just announced a $5 billion battery factory to be completed by 2017.

The development of combined solar PV and batteries from Tesla and others promises to make solar + storage accessible to increasing numbers of consumers. With storage, solar consumers could form their own microgrid, either by themselves or with their neighbours, and disconnect from the grid. Alternatively, they could become a source of dispatchable power and become an energy provider. Increasingly, they could find that either of these options is economically advantageous. This has serious implications for local utilities, because if this trend develops it could seriously erode the traditional utility revenue base. It could also lead to a completely decentralized grid composed of many microgrids. From an operations perspective, it increases the complexity of managing the grid compared to the centralized model in use today.

Directions Magazine: Glympse Debuts New Android Custom Keyboard for Instant Location Sharing

Directions Magazine: Surrey Satellite US Wins NASA Goddard Contract for Landsat Instrument Study

Directions Magazine: Cadcorp announces new software release including free-to-use desktop GIS

My Corner of the Web: Rails Best Practices

Bob Roberts:

A couple of these tricks I wasn't aware of, like find_each. Thanks for the write up ;)

Originally posted on Rails Best Practices:

With the fast pace of today's agile development industry, we know how important it is to complete a project on time. We also recognize the importance of other factors such as flexibility, readability and, most importantly, the performance of the application.

Many times even experienced developers do not consider the above points. They won't matter much during the initial stages of an application, but they will raise problems when the data in the application grows exponentially and the existing application has to be enhanced.

Below are some best practices in Ruby on Rails which should be considered during development with the Rails framework.

* Use Eager Loading (Prevent N + 1 queries)

Most of the time, new Rails developers do not use eager loading of objects in Rails.

Eager loading is highly recommended during development in Rails.

It mainly resolves common issues like N + 1 queries.

We can detect the N…



geoMusings: Personal Thoughts On the AppGeo Announcement

I read with great interest today’s announcement that AppGeo is no longer an Esri Business Partner. I find the announcement significant for a number of reasons, which I will explore shortly. I have always respected AppGeo’s work. As a small business that does geospatial consulting, they have foregone the “grow at all costs” approach that is seen all too often in the consulting world. They generally stuck to what they do well and branched out conservatively in ways that tie logically back to their core business.

I first met the President of AppGeo, Rich Grady, at an early HIFLD meeting many years ago. (It may have even been before the group was called “HIFLD.”) The work they were doing then was very relevant to critical infrastructure protection efforts and, had some of their concepts for sharing data between state, local, and Federal agencies been adopted, we’d probably be better off today. I have always considered Rich one of the good guys in the geospatial industry and the company he has built reflects his integrity.

Over the years, our companies haven’t quite found the right vehicle to work together and we sometimes even compete against each other. That’s the nature of the consulting business. You will often compete against friends and still be able to have dinner together later.

So I was happy for Rich and AppGeo when I read their announcement. As I said above, I found it significant in a few ways…

Professionalism

The announcement was a textbook example of the professionalism with which such matters should be handled. AppGeo did not burn down the house in announcing they were no longer a business partner. There were no declarations that “Esri sucks” or any other such unnecessary hyperbole. They discussed real, significant differences that had evolved over time with Esri and that led to an agreement that their business interests had diverged to the point that Business Partner status no longer made sense. While Esri initiated the final break, AppGeo handled it well, stated their case professionally, and gave no ground. Well done. This announcement should be required reading for the various factions and camps in our industry.

A Sign of Maturity

I found the announcement significant because I see it as evidence that AppGeo, as a geospatial business, has evolved to a point of maturity that most businesses never reach. They have built a diverse portfolio and realize they can go it alone without the safety net of Redlands under them. I will note that my company is an Esri Business Partner so I guess we haven’t gotten there yet. I have worked as a consultant for my entire career and I believe to my bones that my primary job is to recommend the most appropriate solution to my customers. That is quite often an Esri solution for a lot of valid reasons, but not always. Right now, I am working on a project for a Federal agency based entirely on open-source geospatial tools. Often, a hybrid solution is really the best fit. The point is that, as a consultant, my job is to first gather data, analyze it, and determine the best solution. My job is not to walk in with a pre-conceived notion. If you call yourself a consultant and never look outside the Esri stable of tools (or whomever your preferred vendor may be), then you are not a consultant but a salesman. Just own it.

AppGeo recognizes this and has built a consulting practice that can proceed forward with less influence from large vendors. I commend them for that.

Maturity of Open-Source Geospatial

This next observation is statistically spurious as it is based only on this particular case, but I see this announcement as a sign of the increased maturity of the open-source geospatial segment. AppGeo is still a fairly small business, and many small consulting shops tend to be very conservative in their business moves. The fact that they are willing to decouple themselves from Esri indicates that they are not only confident in their capabilities with a diverse tool set, but that they are also confident in the stability and reliability of the tools in terms of being able to stake their business on them. Those of us who work with open-source geospatial tools are not surprised by this but it is important to note that AppGeo is as deeply proficient with the Esri tool suite as any company I’ve seen. I am not privy to their internal decision-making process but, at some point, a comparison must have yielded the conclusion that open-source tools (and Google) were sufficient to stand on equal footing in their capability portfolio. When a once-marquee partner like AppGeo reaches that conclusion, I find it significant.

Those represent some of my thoughts after reading the announcement. I recognized a lot in it but I can’t say I’ve had all of the experiences they describe. I can’t say that I’ve ever received any poor treatment from Esri as a Business Partner. I have heard enough stories from others that I don’t doubt that it happens but I have never personally experienced it. Esri has always been exceedingly professional with me and my company. I don’t feel like I’ve ever felt undue pressure to push their products to the front of the line but I recognize that my experience may be atypical.

In many ways, I expect all vendors to be partisan about their products, be it Esri or Microsoft or Oracle or Boundless or whomever. I would be suspicious if they weren’t. Perhaps, because of that expectation, I’ve developed a bit of a filter for it but it doesn’t bother me. My job as a consultant is to stand between my customers and the vendors and use my experience to sort through the market blather, find the real data, help my customers make an informed decision, and then help them implement. I have always felt that proficiency with a broad set of technologies makes me a better consultant, even for my strictly-Esri customers. AppGeo clearly understands this better than most and I commend them on reaching this significant and positive milestone.

'sproke: Useful tools: Oracle SQL Developer Data Modeler

Oracle SQL Developer Data Modeler is a useful tool for database design that supports building logical and physical models.

To run on OS X Mountain Lion, it needs Java 1.7 for OS X.
