Planet Geospatial

VerySpatial: DigitalGlobe introduces PERSPECTIVES Magazine

DigitalGlobe, a commercial high-resolution earth imagery company, launched its aptly named e-magazine, PERSPECTIVES, today. The trade magazine provides 52 pages of stunning imagery and detailed information on the satellite imagery and remote sensing industry. Although it focuses on DigitalGlobe technologies, the magazine provides insight into a broad swath of topic areas, from mineral exploration to infrastructure to penguin migration. PERSPECTIVES will provide articles, case studies, and technical papers in upcoming issues.



LiDAR News: “Angkor Revealed” on Smithsonian Channel

The Smithsonian Channel will present a two-part series on the investigation of an ancient Cambodian civilization. Continue reading →

Click Title to Continue Reading...

Directions Magazine: Global Mapper LiDAR Module Released with Feature Extraction to Create 3D Buildings and Trees

All Points Blog: GIS Government Weekly News: Chicago Open Data Scripts, Crime Mapping, Chinese World Land Use Data

City of Chicago Launches OpenData ETL Utility Kit

The City of Chicago’s Department of Innovation and Technology released the OpenData ETL Utility Kit at the Code for America Summit this morning. The ETL Utility Toolkit will give cities the same tools Chicago uses to get data from... Continue reading

GeoServer Blog: GeoServer at FOSS4G

Thanks to the PDX-OSGeo local chapter and the organising committee of FOSS4G for a great event. We had a great time meeting everyone, enjoyed an excellent and diverse program, and had a productive weekend code sprint.

Marcus Sen and Jody Garnett have kindly gathered up links to GeoServer content.

Group Talks:

GeoServer Talks:

Comparative Evaluations:

Vertical Integration:

Misc / GeoTools:

Directions Magazine: SimActive Introduces Game-Changing Technology for Mosaic Creation

Directions Magazine: How CSX Used Location Technology to Mitigate Field Risks and Reduce Costs

GIS Lounge: GLOWABO – Remotely Sensed Inventory of the World’s Lakes

Until now, there have only been estimates of how many lakes there are in the world.  Four researchers from France, Sweden, Estonia, and the United States have attempted to answer the long-standing question of how many lakes there are in the world using remote sensing techniques. The team applied an automated algorithm [...]

The post GLOWABO – Remotely Sensed Inventory of the World’s Lakes appeared first on GIS Lounge.

Making Maps: DiY Cartography: Gilpin’s Map of the Isothermal Zodiac and Axis of Intensity Round the World + Calcareous Plain & Maritime Selvage, Etc., Etc. Maps | 1873


Map of the World. Delineating the Contrasted Longitudinal and Latitudinal forms of the continents: the Isothermal Zodiac and Axis of Intensity Round the World; and the Line of Cosmopolitan Railway and its Longitudinal Feeders.


Herein lie a half dozen very odd yet striking maps, published in William Gilpin’s Mission of the North American People (1873). Gilpin, a Quaker from Philadelphia, moved west in the 1830s, joining John C. Frémont on his 1843 expedition. Eventually serving as governor of Colorado, Gilpin was a booster of the American West in general, with a vision of boundless future prosperity. His belief in Manifest Destiny, wedded to odd climatological theories (some of which are mapped out here), promoted his vision of the American West.

Maps are from Mission of the North American People, Geographical, Social, and Political. Illustrated by Six Charts Delineating the Physical Architecture and Thermal laws of all the Continents. By William Gilpin, Late Governor of Colorado. Philadelphia: J.B. Lippincott & Co. 1873. Maps can be found at David Rumsey’s site as well as the full text of Gilpin’s book at the Internet Archive.



Map of North America in which are delineated the Mountain System as a Unit, the Great Calcareous Plain and its Details, and the Continuous Encircling Maritime Selvage.


Map of North America. Delineating the “Mountain System” and its details, The “Great Calcarious Plain” as a unit, and the continuous encircling “Maritime Selvage.”


Map Illustrating the System of the Parcs, and the Domestic Relations of the “Great Plains,” the “North American Andes,” and the “Pacific Maritime Front.”


Map of Colorado Territory, and Northern Portion of New Mexico Showing the System of Parcs.


Thermal Map of North America. Delineating the Isothermal Zodiac, the Isothermal Axis of Identity, and its expansions up and down the “Plateau.”

GIS Lounge: Improved and More Realistic Minecraft Map of Great Britain

Last year, UK Ordnance Survey (OS) intern Joseph Braybrook created a geographically accurate map of Great Britain for the virtual reality game Minecraft.  That map was created using 22 billion blocks.  Now a full-time employee with the OS, Joseph has built an improved map using 83 billion blocks. From the [...]

The post Improved and More Realistic Minecraft Map of Great Britain appeared first on GIS Lounge.

Directions Magazine: Caliper Offers Updated UK Data for Use with Maptitude 2014

Directions Magazine: Optech announces new airborne lidar sensor

Directions Magazine: URISA to Present Certified Workshops Online

The Map Guy(de): MapGuide tidbits: Strange fonts on Linux

Does your MapGuide Linux installation render labels that look like this?

Instead of this?

This is due to MapGuide not being able to locate the required font (Verdana). To fix this, you need to install the appropriate font package that provides the missing font. In the case of Verdana, that package is msttcorefonts.

On Ubuntu: Simply install the msttcorefonts package

On CentOS: Install the following packages:

  • curl
  • xorg-x11-font-utils
  • fontconfig
And then install the following RPM packages with the rpm command
  • rpm -i
  • rpm -i
Now you can verify that the font is installed by running the fc-list command and looking for its name in the output:

fc-list | grep Verdana

Even after this, MapGuide may still fail to locate the font, even though it is installed.

If this is still the case, copy the physical font files to the directory of the mgserver executable (e.g. /usr/local/mapguideopensource-2.6.0/server/bin) and restart the MapGuide Server.

In the case of msttcorefonts, the physical font files are found in:
  • On Ubuntu: /usr/share/fonts/truetype/msttcorefonts
  • On CentOS: /usr/share/fonts/msttcore

Directions Magazine: TouchShare Announces a Free Trial Offer to its Real-time Geospatial Collaboration Platform

BostonGIS Blog: Waiting for PostGIS 2.2 - ST_ClipByBox2D - Map dicing Redux in concert with ST_Tile

One of the new features coming in PostGIS 2.2 is ST_ClipByBox2D (thanks to Sandro Santilli's recent commits funded by CartoDB). However, to take advantage of it, you are going to need PostGIS compiled with GEOS 3.5+, a very recent build which has not been released yet. Windows folks: the PostGIS 2.2 experimental binaries for PostgreSQL 9.3 and 9.4 are built with the latest GEOS 3.5 development branch, so you should be able to test this out with Winnie's experimental builds.

Since the dawn of PostGIS, users have needed to mutilate their geometries in often painful and horrific ways. Why is the ST_ClipByBox2D function useful? Because it's a much faster way of mutilating your geometries by a rectangular mold than using ST_Intersection. Why would you want to mutilate your geometries? There are many reasons, but I'll give you one: as your geometry approaches the area of your bounding box, and as your bounding box size decreases, your spatial index becomes more efficient. You can consider this article, Map Dicing Redux, a follow-up to one I wrote eons ago, Map Dicing and other stuff, which describes the same approach with much older technology. Though I will be using more or less the same dataset, Massachusetts TOWNSSURVEY_POLYM (it's newer, so you can't really compare), and I have simplified the exercise a bit (no temp tables and such), my focus in this article is to compare the speed of the new ST_ClipByBox2D approach against the old ST_Intersection approach. The spoiler, for those who don't have the patience for the full exercise: on my sample dataset, on my puny Windows 7 64-bit desktop running PostgreSQL 9.4beta2 64-bit, ST_ClipByBox2D was about 4-5 times faster than ST_Intersection. There was a downside to this speedup: the ST_Intersection approach produced no invalid polygons, while ST_ClipByBox2D produced one. So, as noted in the docs, use with caution. We'd be very interested in hearing other people's experiences with it.

One other benefit that ST_ClipByBox2D has over ST_Intersection which I didn't test out is that although ST_Intersection doesn't work with invalid geometries, ST_ClipByBox2D can.
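The core rectangle-clipping idea behind ST_ClipByBox2D can be illustrated in plain Python with the classic Sutherland-Hodgman algorithm. This is a sketch only: the names are my own, and GEOSClipByRect (which backs ST_ClipByBox2D) uses its own implementation that also copes with the invalid inputs mentioned above.

```python
def clip_polygon_by_box(poly, xmin, ymin, xmax, ymax):
    """Clip a simple polygon (list of (x, y) vertices) to an axis-aligned box
    by successively clipping against each of the box's four edges
    (Sutherland-Hodgman). Illustrative sketch, not GEOS's implementation."""

    def clip_edge(points, inside, intersect):
        out = []
        for i, cur in enumerate(points):
            prev = points[i - 1]  # wraps to the last vertex for i == 0
            if inside(cur):
                if not inside(prev):
                    out.append(intersect(prev, cur))  # entering the half-plane
                out.append(cur)
            elif inside(prev):
                out.append(intersect(prev, cur))      # leaving the half-plane
        return out

    def x_cross(p, q, x):
        # intersection of segment p-q with the vertical line at x
        t = (x - p[0]) / (q[0] - p[0])
        return (x, p[1] + t * (q[1] - p[1]))

    def y_cross(p, q, y):
        # intersection of segment p-q with the horizontal line at y
        t = (y - p[1]) / (q[1] - p[1])
        return (p[0] + t * (q[0] - p[0]), y)

    for inside, intersect in [
        (lambda p: p[0] >= xmin, lambda p, q: x_cross(p, q, xmin)),
        (lambda p: p[0] <= xmax, lambda p, q: x_cross(p, q, xmax)),
        (lambda p: p[1] >= ymin, lambda p, q: y_cross(p, q, ymin)),
        (lambda p: p[1] <= ymax, lambda p, q: y_cross(p, q, ymax)),
    ]:
        if not poly:
            return []
        poly = clip_edge(poly, inside, intersect)
    return poly
```

Because each pass only walks the vertex list once, clipping to a box this way avoids the general-purpose intersection machinery, which is the intuition behind the speedup reported above.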

For these exercises, I'm also going to abuse the PostGIS raster function ST_Tile for geometry use, because for some strange reason we have no ST_Tile function for PostGIS geometry. For those who missed the improper use of ST_Tile for raster, refer to Waiting for PostGIS 2.1 - ST_Tile.
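The dicing idea itself, cutting a layer's extent into many small bounding boxes to feed a clipping function, is easy to sketch. Here is an illustrative pure-Python helper (dice_bbox is my own name, not a PostGIS function):

```python
def dice_bbox(xmin, ymin, xmax, ymax, nx, ny):
    """Split a bounding box into an nx-by-ny grid of tile boxes,
    returned as (xmin, ymin, xmax, ymax) tuples, row by row."""
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    return [(xmin + i * dx, ymin + j * dy,
             xmin + (i + 1) * dx, ymin + (j + 1) * dy)
            for j in range(ny) for i in range(nx)]
```

Each tile box would then become the rectangular "mold" passed to ST_ClipByBox2D (or ST_Intersection) for one diced piece.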

Continue reading "Waiting for PostGIS 2.2 - ST_ClipByBox2D - Map dicing Redux in concert with ST_Tile"

GIS Cloud: GIS Cloud at INTERGEO Conference

We are happy to announce that GIS Cloud is attending the INTERGEO conference in Berlin from the 7th to the 9th of October.

INTERGEO is the world’s leading conference and trade fair for Geodesy, Geoinformation and Land Management. With over half a million event website users and over 16,000 visitors from 92 countries at INTERGEO itself, it is one of the key platforms for industry dialogue around the world.

The conference celebrates its 20th anniversary with ‘Showcasing the very best Europe has to offer’ as its key guiding principle. The defining focus is to highlight Europe’s best examples of the use and role of geographic information and geo-technologies in a variety of thematic fields. Be sure to ping us, and see you there!

Spatial Law and Policy: FAA Provides Guidance on Commercial Use of UAVs

Recently, the Federal Aviation Administration (FAA) granted several film and television production companies the right to use UAVs for filming. These permissions are exemptions to the general U.S. prohibition on the use of UAVs for commercial purposes, and were issued in response to written requests from the companies pursuant to Section 333 of the FAA Modernization and Reform Act of 2012 (“Section 333”). Section 333 grants the FAA the authority to determine whether certain UAVs can operate in the national airspace subject to certain requirements or conditions. According to published reports, approximately 40 such requests have been made, and the FAA is trying to decide on each request within 120 days of receipt.

The FAA letters granting approval are quite detailed (and technical). As a result, I have tried to identify the main requirements below:
  • The UAV must weigh less than 55 pounds (25 Kg), including energy source(s) and equipment.
  • The UAV may not be flown at a speed exceeding a ground speed of 50 knots.
  • Flights must be operated at an altitude of no more than 400 feet above ground level.
  • The UAV must be operated within visual line of sight (VLOS) of the Pilot-In-Command (PIC) at all times. This requires the PIC to be able to use human vision unaided by any device other than corrective lenses.
  • All operations must utilize a visual observer (VO). The VO may be used to satisfy the VLOS requirement, as long as the PIC always maintains VLOS capability. The VO and PIC must be able to communicate verbally at all times.
  • Prior to each flight the PIC must inspect the UAS to ensure it is in a condition for safe flight.
  • The operator must follow the manufacturer’s UAS aircraft/component, maintenance, overhaul, replacement, inspection, and life limit requirements.
  • The PIC must possess at least a private pilot certificate and at least a current third-class medical certificate. The PIC must also meet the flight review requirements for an aircraft in which the PIC is rated on his/her pilot certificate.  In addition, the PIC must have accumulated and logged:
  • a minimum of 200 flight cycles and 25 hours of total time as a UAS rotorcraft pilot and at least ten hours logged as a UAS pilot with a similar UAS type; and,
  • a minimum of five hours as UAS pilot operating the make and model of the UAS to be utilized for operations under the exemption and have conducted three take-offs and three landings in the preceding 90 days.
  • The UAV may not be operated directly over any person, except authorized and consenting personnel, or below an altitude that is hazardous to persons or property on the surface.
  • The operator must ensure that persons are not allowed within 500 feet of the area except those consenting to be involved and necessary.
  • The UAS must abort the flight in the event of unpredicted obstacles or emergencies in accordance with the operator’s manual.
  • Each UAS operation must be completed within 30 minutes flight time or with 25 percent battery power remaining, whichever occurs first.
  • The UAV must yield the right of way to all other manned operations and activities at all times.
  • The operator must obtain an Air Traffic Organization (ATO) issued Certificate of Waiver or Authorization (COA) prior to conducting any operations under this grant or exemption.
  • UAS operations may not be conducted at night.
  • The UAS cannot be operated by the PIC from any moving device or vehicle.
  • The UAV may not operate in Class B, C, or D airspace without written approval from the FAA.
  • At least three days before flying, the operator of the UAS affected by this exemption must submit a written Plan of Activities to the local Flights Standards District Office (FSDO), to include:
    • A statement that the operator has obtained permission from property owners and/or local officials; and,
    • A description of the flight activity, including maps or diagrams of any area, city, town, county, and/or state over which filming will be conducted and the altitudes essential to accomplish the operation.
The FAA has taken an important first step in allowing the use of UAVs for commercial purposes in the U.S. The permissions that have been granted are limited, as they apply only to the specific companies that made the requests. However, they are useful in identifying the steps the FAA currently considers critical for operating UAVs. Until the FAA proposes definitive regulations, companies that make a Section 333 request to operate UAVs for commercial purposes in the U.S. should expect to be subject to similar constraints, or should identify alternatives that address the FAA’s safety and operational concerns.

All Points Blog: Teaching a Flipped GIS Course: Challenges and Opportunities

How do blended learning, flipped-classroom concepts and other newfangled educational thinking work in GIS? Penn State's Alex Klippel provided some insights yesterday via a webinar. It was part of the University Consortium for Geographic Information Science (UCGIS) series and titled... Continue reading

nyalldawson.net: Creating custom colour schemes in PyQGIS

In my last post I explored some of the new colour related features available in QGIS 2.6. At the end of that post I hinted at the possibility of creating QGIS colour schemes using python. Let’s take a look…

We’ll start with something nice and easy – a colour scheme which contains a predefined set of colours (e.g., standard company colours). This is done by subclassing QgsColorScheme and implementing the required methods ‘schemeName’, ‘fetchColors’ and ‘clone’. It’s all fairly self-explanatory – most of the important stuff happens in fetchColors, which returns a list of QColor/string pairs. Here’s a sample:

from PyQt4.QtCore import *
from PyQt4.QtGui import *

class QgsCgaLightColorScheme(QgsColorScheme):
    def schemeName(self):
        return "CGA Colors!"
    def fetchColors(self, context='', basecolor=QColor()):
        return [[QColor('#555555'), 'Gray'],
                [QColor('#5555FF'), 'Light Blue'],
                [QColor('#55FF55'), 'Light Green'],
                [QColor('#55FFFF'), 'Light Cyan'],
                [QColor('#FF5555'), 'Light Red'],
                [QColor('#FF55FF'), 'Light Magenta']]
    def flags(self):
        return QgsColorScheme.ShowInAllContexts
    def clone(self):
        return QgsCgaLightColorScheme()

cgaScheme = QgsCgaLightColorScheme()
QgsColorSchemeRegistry.instance().addColorScheme(cgaScheme)

This scheme will now appear in all colour buttons and colour picker dialogs:

CGA colours… what your map was missing!

If you only wanted the scheme to appear in the colour picker dialog, you’d modify the flags method to return QgsColorScheme.ShowInColorDialog instead.

QgsColorSchemes can also utilise a “base colour” when generating their colour list. Here’s a sample colour scheme which generates slightly randomised variations on the base colour. The magic again happens in the fetchColors method, which copies the hue of the base colour and generates random saturation and value components for the returned colours.

from PyQt4.QtCore import *
from PyQt4.QtGui import *
import random

class QgsRandomColorScheme(QgsColorScheme):

    def schemeName(self):
        return "Random colors!"

    def fetchColors(self, context='', basecolor=QColor()):
        noColors = random.randrange(30)
        minVal = 130
        maxVal = 255
        colorList = []
        for i in range(noColors):
            if basecolor.isValid():
                # keep the hue of the base colour
                h = basecolor.hue()
            else:
                # no valid base colour, so generate a random hue
                h = random.randrange(360)

            s = random.randrange(100, 255)
            v = random.randrange(minVal, maxVal)

            colorList.append([QColor.fromHsv(h, s, v), "random color! " + str(i)])

        return colorList

    def flags(self):
        return QgsColorScheme.ShowInAllContexts

    def clone(self):
        return QgsRandomColorScheme()

randomScheme = QgsRandomColorScheme()
QgsColorSchemeRegistry.instance().addColorScheme(randomScheme)

Here’s the random colour scheme in action… note how the colours are all based loosely around the current red base colour.

Randomised colours

You may also have noticed the context argument for fetchColors. This can be used to tweak the returned colour list depending on the context of the colour picker. Possible values include ‘composer’, ‘symbology’, ‘gui’ or ‘labelling’.
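As an illustration of how a scheme might use that argument, the sketch below (fetch_colors_demo and its palettes are hypothetical, not QGIS API) returns a restrained, high-contrast list when the picker is opened from the labelling settings:

```python
def fetch_colors_demo(context=''):
    """Illustrative only: return a smaller palette for the 'labelling'
    context and the full palette everywhere else."""
    full = ['#555555', '#5555FF', '#55FF55', '#FF5555']
    label_safe = ['#555555', '#FF5555']  # hypothetical high-contrast picks
    return label_safe if context == 'labelling' else full
```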

One final fun example… here’s a colour scheme which grabs its colours using the Colour Lovers API to fetch a random popular palette from the site:

from PyQt4.QtCore import *
from PyQt4.QtGui import *
from xml.etree import ElementTree
import urllib2
import random

class colorLoversScheme(QgsColorScheme):

    def __init__(self, parent=None): 
        xmlurl = ''

        headers = { 'User-Agent' : 'Mozilla/5.0' }
        req = urllib2.Request(xmlurl, None, headers)
        doc = ElementTree.parse(urllib2.urlopen(req)).getroot()

        palettes = doc.findall('palette')
        palette = random.choice(palettes)

        title = palette.find('title').text
        username = palette.find('userName').text
        attrString = title + ' by ' + username
        colors = ['#'+c.text for c in palette.find('colors').findall('hex')]

        self.color_list = [[QColor(c), attrString] for c in colors]

    def schemeName(self):
        return "Color lovers popular palette"

    def fetchColors(self, context='', basecolor=QColor()):
        return self.color_list

    def flags(self):
        return QgsColorScheme.ShowInAllContexts

    def clone(self):
        return colorLoversScheme()

loversScheme = colorLoversScheme()      
QgsColorSchemeRegistry.instance().addColorScheme( loversScheme )

Clicking a colour button will now give us some daily colour scheme inspiration…

Grabbing a palette from the Colours Lovers site

Ok, now it’s over to all you PyQGIS plugin developers – time to go wild!

LiDAR News: Documenting the RRS Discovery

Not only did they accomplish this task, but they also created a very impressive case study on their website. Continue reading →


All Points Blog: Where Location Analytics Can Save Corporations from Drowning in Big Data - HBR

In a blog post in the Harvard Business Review, Brian McCarthy, a managing director for Accenture, cautions corporations about drowning in "big data" overload that threatens their ability to prioritize and solve problems. The examples that McCarthy provides illustrate how location-based... Continue reading

Between the Poles: FERC Order 745 vacated: negawatt does not equal a megawatt

Demand response programs involve paying power consumers not to use power at certain times.  Reducing peak load can save consumers money by avoiding the use of or even construction of "peakers", generation facilities that are very expensive to run and that are only used at times of peak load.  If I remember correctly, something like 15% of US generation capacity is only used 5% of the time.  Demand response programs are very common in parts of Europe including Switzerland and Germany where they have been in wide use for at least ten years to reduce peak load and smooth out the diurnal load curve.

In March 2011, the Federal Energy Regulatory Commission (FERC), in Order No. 745, ruled that demand response (paying for reducing power demand) must be compensated at the same rate as power generation, all other things being equal.  In other words, compensation for saving a megawatt-hour (a "negawatt") is the same as for generating a megawatt-hour.

In May 2014, the U.S. Court of Appeals in Washington, D.C. vacated FERC’s Order 745, stating that FERC overstepped its jurisdiction and that the decision about the compensation demand response providers must receive should lie with the states.  This means that a negawatt does not necessarily equal a megawatt.

GIS Lounge: Measuring Small Variations in the Earth’s Gravity

Scientists can create higher resolution maps showing small variations in the earth’s gravity gradient by combining data from two different satellites. The NASA–German Grace satellites were launched in May of 2002 in order to map variations in Earth’s gravity field.  The two satellites fly about 220 kilometers (137 miles) apart in a polar [...]

The post Measuring Small Variations in the Earth’s Gravity appeared first on GIS Lounge.

GIS Lounge: Measuring Carbon Dioxide and Nitrogen Dioxide Trends with Remote Sensing

Most measurements of greenhouse gases use a bottom-up approach, estimating emissions based on reported fossil fuel consumption from power plants and other sources.  Researchers from the University of Bremen recently published in Nature Geoscience the results of an effort to implement a top-down approach using data acquired remotely. Data was [...]

The post Measuring Carbon Dioxide and Nitrogen Dioxide Trends with Remote Sensing appeared first on GIS Lounge.

It's All About Data: 6 Examples from FME Cloud Early Adopters [w/Videos]

FME Cloud has been in public beta for close to a year now. Having spent a lot of time building FME Cloud with our incredible developers, I get extremely excited to see the new and innovative ways it’s being used by our customers.

Below are just a few of the things they are doing with FME Cloud.

1. – A Cloud-based Data Quality and Transformation Platform

Our good friends at INSER have built a web service that lets people perform a wide variety of data quality and transformation tasks online. Simply select the task you want, upload your data, and download the result. As Jean-Luc Miserez says, the main goal of the system is to “make complicated processes easily accessible”. And they’ve done that.

Stepping back a bit, what I find especially exciting here (and it’s a trend we’re seeing more often) is that FME Cloud is being used as a platform to create new web services. In this case, the end user of the service doesn’t see or even know anything about FME Cloud. It’s merely being used to make the higher-level service possible.

Get the slides from this presentation.

2. WhiteStar Cloud – Automated Delivery of Oil & Gas Data with FME Cloud

Data delivery is one of the hallmarks of FME Server, and so it is no surprise that this is also one of the biggest uses of FME Cloud.

One such example comes from WhiteStar, which developed WhiteStar Cloud to supply data to the energy industry. WhiteStar’s system lets its clients choose the geography, data layers, and coordinate system they want to access via a map, and then deliver that data to cloud-based systems such as ArcGIS Online, or as a downloadable file format.

If either data endpoint is in the cloud, it makes sense to use the FME Cloud deployment of traditional FME Server.

Get the slides from this presentation.

3. Sterling Geo – Provide Mobile Access to Ordnance Survey Data to Surveyors

In this example, Sterling Geo needed to help a customer complete a condition survey of wooden poles within six months to meet regulatory requirements. They used FME Cloud to manage the data on the back-end and support surveyors in the field by providing them with licensed Ordnance Survey mapping data via a web browser and mobile apps.

Prototypes and Proof-of-Concepts

This project was a one-time thing. After the project was completed, it didn’t live on. It’s here that the “disposable hardware” aspect of FME Cloud makes perfect sense: pay only for what you need.

It also illustrates why FME Cloud is ideal for prototyping new solutions:

  • Easy approval. IT isn’t concerned, as FME Cloud sits outside the firewall.
  • Secure. The system doesn’t expose any corporate systems to the internet.
  • Easy migration. If successful, it can be easily migrated inside the corporate firewall using FME Server’s migration capabilities.
  • Reduced risk. Pay-as-you-go pricing makes it inexpensive to experiment with.  Plus, no hardware to buy.


Get the slides from this presentation.

4. Grafton Technologies - Creating Safer Airports

Randy Murphy of Grafton Technologies won our FME Cloud launch contest with this impressive use case. His system helps airports prevent runway incursions and taxiway collisions, and identify gaps in safety regulations.

He uses FME Cloud to connect many systems, including ArcGIS Online and Google Earth, and to perform complex data transformations to provide an integrated view of the airport. This view helps airports ensure that they have correct signage and to identify any obstacles that need to be removed to meet safety regulations. Again, like with many projects involving FME Cloud, the end user doesn’t see or need to know anything about FME Cloud. It just works.

Get the slides from this presentation.

5. Alpine Shire Council – Providing Bushfire Assessments in the Field

Barrett's view from the Alpine Shire Council offices as the 2006 bushfire threatened the local mountainside.

Mobile applications are popping up everywhere. If your mobile app needs to leverage spatial data, it is hard to imagine a more powerful or cost-effective back-end platform than FME Cloud.

Our very first FME Cloud user, Barrett Higman, did just that with his award-winning BAL Plan app. BAL (Bushfire Attack Level) measurements tell landowners how susceptible their building site is to bushfires – which are all too common in Australia – and whether they are able to obtain a permit to build. With Barrett’s app, assessors can perform this analysis in the field within seconds by using an iPad’s onboard GPS and FME Cloud to do the back-end heavy lifting.

 Read the blog post with more details on this story.

6. Safe Software – Yes, We Use FME Cloud Too!

If you don’t mind the expression, we like to eat our own dog food. We already use FME Cloud to sync data between our website and back-end data sources. Our IT team uses FME for many things and aims to use FME Cloud to improve integration between internal cloud systems, like Salesforce CRM, for improved corporate reporting.

Of course, we’ve also used it for more lighthearted projects, like powering the quiz for the FME World Tour 2014 and the recent FME World Cup of Data 2014. For the latter, we used FME Cloud to power a soccer-themed format contest. It managed the contest submissions, updated the website with match results, and even tweeted the scores for each game. Below, De Wet gets into more detail on why he used FME Cloud for the contest.

$250 Free Credit for FME Cloud

We give everyone that signs up for FME Cloud a $250 credit.  What can you do with a $250 credit?

  1. You can run a starter FME Server for about 11 days straight 24/7.
  2. You can run a starter FME Server for almost 7 weeks at 40 hours per week.
  3. You can start and stop your server instance as needed and go a lot longer.

Aside from a very small data storage charge, with FME Cloud  you only pay for the hours your server is running.
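For what it's worth, the two figures above pin down the implied starter-instance rate. This is a back-of-envelope check only; the per-hour price is inferred from the post's numbers, not an official rate:

```python
credit = 250.0                  # the free sign-up credit, in dollars
hours_around_clock = 11 * 24    # "11 days straight 24/7" -> 264 hours
hours_office = 7 * 40           # "almost 7 weeks at 40 hours per week" -> 280 hours

# Both figures imply a starter instance costs roughly $0.90-$0.95 per running hour.
rate_a = credit / hours_around_clock
rate_b = credit / hours_office
```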

Try it yourself by clicking here and you too can experience all that FME Cloud has to offer.

If you are using FME Cloud, please tell us how you are using it!  We are always excited to engage with you. One way to do just that is to join Stewart Harper and me on October 2nd for a live Q&A on FME Cloud.

The post 6 Examples from FME Cloud Early Adopters [w/Videos] appeared first on Safe Software Blog.

nyalldawson.net: What’s new in QGIS 2.6 – Tons of colour improvements!

With one month left before the release of QGIS 2.6, it’s time to dive into some of the new features it will bring… starting with colours.

Working with colours is a huge part of cartography. In QGIS 2.4 I made a few changes to improve interaction with colours. These included the ability to copy and paste colours by right clicking on a colour button, and dragging-and-dropping colours between buttons. However, this was just the beginning of the awesomeness awaiting colours in QGIS 2.6… so let’s dive in!

Part 1 – New colour picker dialog

While sometimes it’s best to stick with an operating system’s native dialog boxes, colour pickers are one exception to this. That’s because most native colour pickers are woefully inadequate and are missing a bunch of features that make working with colours much easier. So, in QGIS 2.6, we’ve taken the step of rolling out our very own colour picker:

New QGIS colour picker

Before starting work on this, I conducted a review of a number of existing colour picker implementations to find out what works and what doesn’t. Then, I shamelessly modelled this new dialog off the best bits of all of these! (GIMP users will find the new dialog especially familiar – that’s no coincidence, it’s a testament to how well crafted GIMP’s colour picker is.)

The new QGIS colour picker features:

  • Colour sliders and spin boxes for Hue, Saturation, Value, Red, Green and Blue colour components
  • An opacity slider (no more guessing what level of transparency “189” corresponds to!)
  • A text entry box which accepts hex colours, colour names and CSS rgb(#,#,#) type colours. (The drop down arrow you can see on this box in the screenshot above allows you to specify the display format for colours, with options like #RRGGBB and #RRGGBBAA)
  • A grid of colour swatches for storing custom colours
  • A visual preview of the new colour compared to the previous colour
  • Support for dragging and dropping colours into and out of the dialog
  • A colour wheel and triangle method for tweaking colours (by the way, all these colour widgets are reusable, so you can easily dump them into your PyQGIS plugins)
    Colour wheel widget
  • A colour palettes tab. This tab supports adding and removing colours from a palette, creating new palettes and importing and exporting colours from a GPL palette file. (We’ll explore colour palettes in more detail later in this post.)
    Colour palettes
  • A colour sampler! This tab allows you to sample a colour from under the mouse pointer. Just click the “Sample color” button, and then click anywhere on the screen (or press the space bar if you’re sampling outside of the QGIS window). You even get the choice of averaging the colour sample over a range of pixels. (Note that support for sampling is operating-system dependent, and it is currently not available under OS X.)
    Built in colour sampler! Woohoo!
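As a rough illustration of the kinds of strings the text entry box accepts, here is a minimal pure-Python sketch of parsing hex and CSS rgb(#,#,#) colours. This is not QGIS’s actual implementation, just the general idea:

```python
import re

def parse_colour(text):
    """Parse a '#RRGGBB' hex string or a CSS 'rgb(r, g, b)' string
    into an (r, g, b) tuple of ints.

    A sketch only -- not QGIS code; named colours are omitted here.
    """
    text = text.strip()
    # Hex form, with or without the leading '#'
    m = re.fullmatch(r'#?([0-9a-fA-F]{6})', text)
    if m:
        value = m.group(1)
        return tuple(int(value[i:i + 2], 16) for i in (0, 2, 4))
    # CSS rgb(r, g, b) form
    m = re.fullmatch(r'rgb\(\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*\)', text)
    if m:
        # Clamp components to the valid 0-255 range
        return tuple(min(int(c), 255) for c in m.groups())
    raise ValueError("unrecognised colour string: %r" % text)

print(parse_colour('#ff8000'))         # (255, 128, 0)
print(parse_colour('rgb(10, 20, 30)')) # (10, 20, 30)
```

The real dialog additionally handles named colours and the #RRGGBBAA alpha form mentioned above.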

Part 2 – New colour button menus

Just as the new colour dialog draws heavily on other colour dialog implementations, this new feature is inspired by Microsoft’s excellent colour buttons in recent Office versions (I make no claim to originality here!). Now, all QGIS colour buttons come with a handy drop-down menu which lets you quickly choose from some frequently used colour shortcuts. You’ve got the previously available options of copying and pasting colours from 2.4, plus handy swatches for recently used colours and for other standard colours.


Handy colour menu for buttons

Part 3 – Colour palettes

You may have noticed in the above screenshot the “Standard colors” swatches, and wondered what these were all about.  Well, QGIS 2.6 has extensive support for colour palettes. There are a few different “built-in” colour palettes:

  • The “Standard colors” palette. This palette can be modified through the Options → Colors tab. You can add, remove, edit, and rename colours, as well as import colour schemes from a GPL palette file. These standard colours apply to your QGIS installation, so they’ll be available regardless of what project you’re currently working on.

    Customising the standard colours

  • The “Project colors” palette. This can be accessed via the Project Properties → Default styles tab. This palette is saved inside the .qgs project file, so it’s handy for setting up a project-specific colour scheme.
  • The “Recent colors” palette. This simply shows colours you’ve recently used within QGIS.

You can easily create new colour palettes directly from the colour picker dialog. Behind the scenes, these palettes are saved into your .qgis/palettes folder as standard GPL palette files, which makes it nice and easy to modify them in other apps or transfer them between installations. It’s also possible to just dump a stack of quality palettes directly into this folder and they’ll be available from within QGIS.
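Since these palettes are plain GPL files, you can also generate them programmatically and drop them into that folder yourself. Here’s a minimal sketch of a GPL writer in pure Python (the palette name and colours are made up, and this isn’t QGIS code; the format is the standard GIMP palette layout):

```python
def write_gpl_palette(name, colours):
    """Serialise colours to the GIMP .gpl palette format.

    colours is a list of ((r, g, b), label) pairs. This sketches the
    minimal layout: a header, a '#' separator, then one colour per line.
    """
    lines = ["GIMP Palette", "Name: %s" % name, "Columns: 4", "#"]
    for (r, g, b), label in colours:
        lines.append("%3d %3d %3d\t%s" % (r, g, b, label))
    return "\n".join(lines) + "\n"

palette = write_gpl_palette("My colours", [((255, 0, 0), "Red"),
                                           ((0, 128, 255), "Sky")])
print(palette)
```

Saving that output as a .gpl file into your .qgis/palettes folder should make it show up alongside the palettes QGIS created itself.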

Perhaps the best bit about colour schemes in QGIS is that they can be created using PyQGIS plugins, which opens up tons of creative possibilities… More on this in a future blog post!

So there we go. Tons of improvements for working with colours are heading your way in QGIS 2.6, which is due out on the 24th October.

(Before we end, let’s take a quick look at what the competition offers over in MapInfo land. Yeah… no thanks. You might want to invest some development time there Pitney Bowes!)

Ed ParsonsBeyond the Selfie Stick..

Having spent the last few days in the tourist hotspot of Florence, it’s clear that the mobile gadget of the year is the selfie stick, a telescopic pole with a mount for your smartphone to take the perfect selfie beyond arm’s length.

It seems that you have not visited somewhere or taken part in any activity without the accompanying selfie, and it’s only a matter of time before Samsung and Apple start putting more effort into the front-facing camera than they do into the traditional one.

The ultimate selfie however may come from a relative of this prototype.. the Nixie.

The Nixie is a personal photography drone and a finalist in the Intel wearable technology contest, and is brilliant in two ways: first as an example of user-centred product design, and secondly as an insight into the potential legal and privacy concerns that will come when micro UAVs become our personal companions.

I’ve already played with this cheap toy, much to the annoyance of the rest of my family.. imagine the near future when the streets of Florence are swarming with them!

VerySpatialA VerySpatial Podcast – Episode 480

A VerySpatial Podcast
Shownotes – Episode 480
28 September 2014

Main Topic: Darren Platakis, Founder of Day Of Geography

  • Click to directly download MP3
  • Click to directly download AAC
  • Click for the detailed shownotes


  • Keep It Clean by Mike McGill

  • News

  • Xprize $15 million Global Learning Challenge
  • Availability of worldwide 30m SRTM elevation data announced by Obama at UN Climate Summit; Africa data set now out
  • IKONOS turns 15
  • Indian spacecraft achieves Mars orbit, MAVEN enters Mars orbit
  • NGA on Web Mercator from July and Esri’s recent comment
  • ArcGIS Online update

  • Web Corner

  • GeoFort

  • Main topic

  • This week we talk to Darren Platakis of Geospatial Niagara about his new Day of Geography initiative.

  • Tip of the Week

  • Earth Science Week 2014

  • Events Corner

  • 10th International gvSIG Conference: 3-5 December, Valencia, Spain
  • GeoComputation 2015: 20-23 May, Dallas – Abstracts due January 15th
  • Esri Education GIS Conference: 18-21 July, Abstracts due October 31st
LiDAR NewsCyArk 500 Annual Summit 2014 is Next Week

    This important cultural heritage event takes place October 7 and 8 at the National Archives in Washington, D.C. Continue reading →

    Click Title to Continue Reading...

    Technical RamblingsSocial Networks and Business Plans

    Like everyone on the internet, I’ve seen a lot about Ello in the last week or so. While I’m not convinced Ello is the next big thing, more recently there have been articles about how Ello must be planning to sell you out, because its proposed business model can never work and all venture capitalists require an exit strategy. Regardless of how true the latter may be, I am not convinced the former is true at all.

    My initial forays into the online world were based on GeoCities and Tripod, like many other people of my generation. In my transition to college, LiveJournal became my home on the internet. It was my first work with an open source project. It was where I made friends, and it was even the website where I met my wife. It was also a website which was run, for years, based on a funding model which was entirely ad-free, at a time when banner ads were the way of the internet.

    When the website started, in the early 2000s, “No ads, ever” was the mantra of the site (like Ello). The site was originally invite-based, so that growth was somewhat limited (like Ello). The site didn’t collect and sell your information to advertisers (like Ello). The site was funded by users who paid for additional features (like Ello); for LiveJournal, features included things like more user pictures, the ability to make posts by making a phone call, domain forwarding, advanced customization options for look and feel.

    LiveJournal functioned as a business this way for a number of years; from at least 2002 to 2005, when it was bought by Six Apart, LiveJournal seemed like a functional business from the outside. It was a small business run by a small number of employees and supported by a dedicated volunteer base who worked to run areas like user support. There was enough of a business there to result in the Six Apart sale in 2005; while no details of the deal were ever published, it seems reasonable to assume that LiveJournal was considered a viable business at the time.

    Now, LiveJournal was never started to be a business. It was started as a way for the creator to keep in touch with his friends. It was run as a semi-business, but as with many things started by people in their idealistic years during and shortly after college, sometimes they lose the ability to maintain the dedicated interest necessary to keep them going. (See also: Most of my early software.) After the sale to Six Apart, the “No ads on LiveJournal” policy slipped somewhat, and a number of social shifts caused a bit of a fall from grace in the somewhat utopian ideals that LiveJournal had. (Not the least among them that ads probably became significantly more profitable and effective…)

    But LiveJournal isn’t the only social network that had this policy. After LiveJournal’s sale to SUP, some of the volunteers from LiveJournal decided that the things LiveJournal stood for were good, and that the system it had was workable, but it needed a bit more realistic business approach, and started Dreamwidth; like LiveJournal, the site is funded through people who purchase additional features for their accounts, rather than advertising. (One of the site’s Guiding Principles is “We won’t accept or display third-party advertising on our service, whether text-based or banner ads. We are personally and ideologically against displaying advertising on a community-based service.”)

    Dreamwidth was founded in 2008, and opened to the public in 2009; it started with invite codes and later was able to move away from them. The site has more than 2M registered accounts, and although it’s not going to be the next Facebook, it’s probably reasonable to assume that the site isn’t losing money hand over fist. (It has been around for 5 years, and shows no signs of unhealthiness that I can see from the outside, though I have no inside knowledge.) It is funded by people who purchase additional features for their accounts.

    The idea of free accounts being paid for by people who want additional features is not new. The claim from some that “…no one has ever tried it as a central business model, at least not in social” is clearly false. Some people have tried it. It has even, to some extent, been successful. And although it is possible that Ello is not planning to do what they say they’re going to, it seems entirely more likely that Ello is trying to follow in the footsteps of those who have gone before it and created social networks that millions called home in the earlier days of the internet.

    If what Ello wants to make is a “sustainable business”, as they’ve claimed, then there is no reason to think that they can’t do it by following exactly the funding model they have proposed. I hope all goes well for them, and that they’re able to hold onto those principles. If they’re not, and you’re still looking for that ad-free, friendly environment that you miss from the earlier internet… there’s always the comfort of Dreamwidth.

    'sprokeLoading JSON-LD Into Elasticsearch

    From the elasticsearch mailing list

    Amine Bouayad amine@***.com

    Thank you all for your responses and interesting conversation about RDF serialization into ES. With regards to my original post, I ended up using a solution based on RDFlib: 

    It works as expected, and compacting the content by using @context does the trick and is flexible. It is an in-memory process however, which could be an issue for those with very large RDF files. When using Jena, I didn't find the ability to add @context mappings, but maybe I didn't dig enough.

    On a side note, looks like the rdflib-jsonld solution already has support for XSD literals and lists, so perhaps it could be extended to map directly into ES _type if that is a good direction.

    With my Json-ld file ready for ingestion into ES, I do have another question: are there utilities to bulk load such documents (the json-ld contains individual documents per ES, each with an _id), or do I just write a script that calls curl -XPUT for each record in the json-ld file? Seems like a pretty common use case.

    Thanks again to all, interesting stuff. Happy to contribute to extending an existing solution.
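For what it’s worth, Elasticsearch’s _bulk API covers exactly this case: rather than one curl -XPUT per record, you build a newline-delimited payload of action/document pairs and POST it to the _bulk endpoint once. A rough Python sketch of building that payload (the index and type names here are placeholders, not anything from the original post):

```python
import json

def to_bulk_ndjson(docs, index, doc_type):
    """Build an Elasticsearch _bulk API payload from a list of dicts.

    Each doc carries its own "_id" (as in the JSON-LD described above),
    which is lifted into the action line. The result is newline-delimited
    JSON: one action line, then one source line, per document.
    """
    lines = []
    for doc in docs:
        body = dict(doc)  # copy so we can pop the _id out of the source
        action = {"index": {"_index": index, "_type": doc_type,
                            "_id": body.pop("_id")}}
        lines.append(json.dumps(action))
        lines.append(json.dumps(body))
    # The _bulk endpoint requires a trailing newline
    return "\n".join(lines) + "\n"

payload = to_bulk_ndjson([{"_id": "1", "title": "first doc"}],
                         "myindex", "mytype")
print(payload)
```

The payload can then be sent in one shot, e.g. `curl -XPOST 'localhost:9200/_bulk' --data-binary @payload.json`.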


    LiDAR NewsStudents Learn About 3D Scanning and Printing

    This is a very encouraging use of 3D scanning and printing to get young students excited about paleontology. Continue reading →

    Click Title to Continue Reading...

    GIS LoungeNational Geospatial-Intelligence Agency’s Web Mercator Advisory Notice

    The National Geospatial-Intelligence Agency (NGA) has an advisory notice posted on its site warning users of the popular Web Mercator projection about errors, stating, “the use of Web Mercator and other non-WGS 84 spatial reference systems may cause geo-location / geo-coordinate errors up to 40,000 meters.”  The NGA further advised, [...]

    The post National Geospatial-Intelligence Agency’s Web Mercator Advisory Notice appeared first on GIS Lounge.

    Andrew Zolnai BlogWeb maps on steroids

    The last six blog posts over the last three months chronicled the use of dynamic maps with a time attribute - years for historic ship tracks and wind data from 150-350 years ago - not to animate maps but to filter them by decade and manage data fetches on ArcGIS Online. A parallel series of posts showed mega data sets on Amazon Web Services, as Mapcentia assured me PostGIS handled giga datasets...

    click to enlarge or go to web map direct (select Layer > Direction Force)

    Well this turned out to be quite true, as detailed on the map catalog blog: almost half a million points totalling almost a gigabyte of data post at a blistering speed... faster than the desktop, in fact! This is due to tile caching in Leaflet on the Mapcentia GeoCloud stack reading Amazon Web Services data in PostGIS.

    Why post so many points, one may ask? Because CLIWOC chronicles over 180 weather and shipping attributes recorded almost daily along sailings over 100 years (plus a sample 100 years older) from British, Dutch, French, and Spanish maritime agencies! This resulted in over 230,000 points, which doubled to 470,000 when multilingual lookup tables were joined to make wind force and direction computable attributes. I queried this on my blog too, in case I lost my way somewhere... In fact my friend Hussein Nassr, who wrote two books on ArcGIS Server and geodatabases, pointed out that:
    It is not recommended to use relational model for such huge data. This is the era of Big data, in technical words, De-normalize, minimize the number of tables, eliminate unnecessary look ups and domains, it is ok to have duplicated data, we don't have a memory or space problem anymore.
    So I did just that, and posted a redux version of the same that maintained the features but cleaned the attributes, resulting in a 181 MB file geodatabase (33 MB compressed). These can be found on ArcGIS Online and ShareGeo Open as shapes.
    And lest we forget, this is just the geo part - climatologists and historians who know this data are in the process of collecting historic weather data, to study climate change history in multidisciplinary and multinational efforts such as ACRE...

    'sprokeello protip: mp4 to animated gif using ffmpeg

    Ello doesn't support videos yet, so animated gifs are the way to go. If you have brew installed you can just install ffmpeg:

    ~ brew install ffmpeg

    To convert a video to gif with ffmpeg:

    ~ ffmpeg -i myvideo.mp4 -vf scale=320:-1 -t 10 -r 10 myvideo.gif

    -t sets the duration to convert (in seconds)
    -r sets the output frame rate (frames per second)

    And there are bunch of other parameters:

    Global options (affect whole program instead of just one file):
    -loglevel loglevel  set logging level
    -v loglevel         set logging level
    -report             generate a report
    -max_alloc bytes    set maximum size of a single allocated block
    -y                  overwrite output files
    -n                  never overwrite output files
    -stats              print progress report during encoding
    -max_error_rate ratio   maximum error rate (0.0: no errors, 1.0: 100% errors)
    -bits_per_raw_sample number  set the number of bits per raw sample
    -vol volume         change audio volume (256=normal)

    Per-file main options:
    -f fmt              force format
    -c codec            codec name
    -codec codec        codec name
    -pre preset         preset name
    -map_metadata outfile[,metadata]:infile[,metadata]  set metadata information of outfile from infile
    -t duration         record or transcode "duration" seconds of audio/video
    -to time_stop       record or transcode stop time
    -fs limit_size      set the limit file size in bytes
    -ss time_off        set the start time offset
    -timestamp time     set the recording timestamp ('now' to set the current time)
    -metadata string=string  add metadata
    -target type        specify target file type ("vcd", "svcd", "dvd", "dv", "dv50", "pal-vcd", "ntsc-svcd", ...)
    -apad               audio pad
    -frames number      set the number of frames to record
    -filter filter_graph  set stream filtergraph
    -filter_script filename  read stream filtergraph description from a file
    -reinit_filter      reinit filtergraph on input parameter changes

    Video options:
    -vframes number     set the number of video frames to record
    -r rate             set frame rate (Hz value, fraction or abbreviation)
    -s size             set frame size (WxH or abbreviation)
    -aspect aspect      set aspect ratio (4:3, 16:9 or 1.3333, 1.7777)
    -bits_per_raw_sample number  set the number of bits per raw sample
    -vn                 disable video
    -vcodec codec       force video codec ('copy' to copy stream)
    -timecode hh:mm:ss[:;.]ff  set initial TimeCode value.
    -pass n             select the pass number (1 to 3)
    -vf filter_graph    set video filters
    -b bitrate          video bitrate (please use -b:v)
    -dn                 disable data

    Audio options:
    -aframes number     set the number of audio frames to record
    -aq quality         set audio quality (codec-specific)
    -ar rate            set audio sampling rate (in Hz)
    -ac channels        set number of audio channels
    -an                 disable audio
    -acodec codec       force audio codec ('copy' to copy stream)
    -vol volume         change audio volume (256=normal)
    -af filter_graph    set audio filters

    Subtitle options:
    -s size             set frame size (WxH or abbreviation)
    -sn                 disable subtitle
    -scodec codec       force subtitle codec ('copy' to copy stream)
    -stag fourcc/tag    force subtitle tag/fourcc
    -fix_sub_duration   fix subtitles duration
    -canvas_size size   set canvas size (WxH or abbreviation)

    -spre preset        set the subtitle options to the indicated preset

    LiDAR NewsDoubling Up on Mobile Mapping

    That's a major investment by a world leader in the field - encouraging news for the industry. Continue reading →

    Click Title to Continue Reading...

    GIS LoungeI’m a…. The GIS Profession in One Word

    When I think about the debate about whether or not GIS is a profession, one aspect that pokes at me is the inability to succinctly sum up the answer to the question, “What do you do?” in one word.  When I think about professions, I think about the one word [...]

    The post I’m a…. The GIS Profession in One Word appeared first on GIS Lounge.

    LiDAR NewsGamer Using Point Clouds

    Storing data as a point cloud allows information to be streamed off conventional hard drives. Continue reading →

    Click Title to Continue Reading...

    All Points BlogGIS Health News Weekly: Alcohol-related Liver Disease in England, Vaccines in Hollywood, Epidemic MOOC

    Liver Disease Health officials have mapped out the places in England which have the highest rates of people admitted to hospital as an emergency for alcohol-related liver disease. The North West and the North East were pinpointed to be the places with the highest hospital... Continue reading

    PostGIS DevelopmentGetting distinct pixel values and pixel value counts of a raster

    PostGIS raster has a great many functions, and probably at least 10 ways of doing any given task, some much slower than others. Suppose you have a raster, or a raster area of interest — say an elevation raster, for example — and you want to know the distinct pixel values in the area. The temptation is to reach for the ST_Value function, but there is a much more efficient function to use, and that is the ST_ValueCount function.

    ST_ValueCount is one of many statistical raster functions available with PostGIS 2.0+. It is a set-returning function that returns 2 values for each row: a pixel value (value), and a count of pixels (count) in the raster that have that value. It also has variants that allow you to filter for certain pixel values.
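To make the semantics concrete, here is what ST_ValueCount computes, sketched in plain Python over a made-up band of pixel values (an illustration of the idea only — in PostGIS you would run something like SELECT (ST_ValueCount(rast)).* FROM your raster table):

```python
from collections import Counter

def value_counts(pixels, nodata=None):
    """Return sorted (value, count) pairs for a band's pixel values,
    skipping nodata pixels -- mirroring what ST_ValueCount reports.
    """
    counts = Counter(p for p in pixels if p != nodata)
    return sorted(counts.items())

# Hypothetical elevation band with one nodata pixel
band = [10, 10, 12, 10, 15, 12, None, 15, 10]
print(value_counts(band, nodata=None))  # [(10, 4), (12, 2), (15, 2)]
```

The point of the tip stands: one aggregate pass over the band, rather than calling ST_Value pixel by pixel.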

    This tip was prompted by the question on stackexchange How can I extract all distinct values from a PostGIS Raster?

    Continue Reading by clicking title hyperlink ..