Attention Neotards: Your GPS Sucks!

Reading through an update from Open Street Map (incredibly, it has taken them mere months to load TIGER into their futuristic system) I came across this little gem regarding aerial photography:

Since the TIGER map data was produced from aerial photography, and was originally intended to assist Census Bureau officials in the field, such problems [misalignment of roads] are bound to occur and are unlikely to have undergone official correction.

I’m not sure what to mock first: the leap to the conclusion that TIGER data problems are a result of aerial photography, or the related conclusion that GPS tracing is somehow superior to aerial photography. They are, of course, closely related in the mind of the neotard: “Hmm, TIGER is old; aerial photography is old; TIGER sucks; aerial photography is old and expensive (so it sucks); therefore, the reason TIGER sucks is that aerial photography sucks!”

(Admission: I don’t know why TIGER positions are so bad in places. The answer may well be the source data, but it cannot be hung off of aerial photography. Much of TIGER is sourced from lower levels of government and then stitched in, so it is probably a pastiche of capture methods. I wouldn’t be surprised if some of it was digitized off of paper road maps, or if some of it has not been positionally updated in 25 years.)

Paleotard

Oh, pity the poor paleotards, who don’t know any better, wasting good money flying about taking error-prone “photographs”, instead of doing the smart thing and walking around with a Garmin. (Is that a Garmin in your pocket, or are you just happy to see me?)

I admit, I suffered from “GPS is magic” syndrome for quite a while, but I had the fortune to be exposed to people whose job it is to make base maps, who have to ensure that the lines they place on the map (in the database) are as close to “true” location as possible, given the technology available. That exposure taught me some useful things about source data collection, and one thing it taught me is that GPS traces are extremely hard to quality control.

The GPS has a very hard job to do. It has to read in multiple signals from satellites and calculate its location based on very, very, very small time differences. What happens when the signals intermittently drop because of trees overhead blocking the signal? Or bounce off of a nearby structure? The unit makes errors. Which would be fine, except it’s hard to isolate which observations are good, and which ones are bad. Too often, GPS track gathering falls back on heuristics that delete “zingers” (really obviously bad readings, determined visually or with a simple algorithm) and assume every other observation is “good”. If you delete zingers and take enough repeated traces, you can slowly get averaged lines that converge towards meter-level accuracy. However, the need for multiple traces radically increases the manpower/cost of gathering decent data, and the accuracy level does max out after a while.
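For the curious, here is a minimal sketch (in Python, using numpy) of the zinger-then-average heuristic described above. The speed threshold, the resampling step, and the function names are all invented for illustration; no real GPS processing toolkit is implied.

    import numpy as np

    def filter_zingers(trace, max_speed=50.0, dt=1.0):
        """Drop any fix that implies an impossible speed from the last kept fix."""
        trace = np.asarray(trace, dtype=float)  # (n, 2) easting/northing in meters
        keep = [0]
        for i in range(1, len(trace)):
            if np.linalg.norm(trace[i] - trace[keep[-1]]) / dt <= max_speed:
                keep.append(i)
        return trace[keep]

    def resample(trace, n=100):
        """Resample a trace to n vertices evenly spaced along its length."""
        seg = np.linalg.norm(np.diff(trace, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative distance
        t = np.linspace(0.0, s[-1], n)
        return np.column_stack([np.interp(t, s, trace[:, 0]),
                                np.interp(t, s, trace[:, 1])])

    def average_traces(traces, n=100):
        """Average repeated traces of the same road, vertex by vertex."""
        cleaned = [resample(filter_zingers(t), n) for t in traces]
        return np.mean(cleaned, axis=0)

Note that the averaging only works because each cleaned trace is first resampled onto a common set of vertices, which is one more fiddly step between you and that converged meter-level line.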

The answer to getting good positional information over a large area is to tie together the strengths of GPS systems and the strength of remote sensing (aerial, satellite) systems.

  • Take a picture from above (or better, borrow one, from the USDA, or USGS, that has already been “orthocorrected”). This provides a very good source of relative measurement information: you can determine very precisely the distance and bearing between any points A and B. But it has no absolute positioning: how do I know the longitude/latitude of A and B?
  • Find several points at the edges of your photograph that are clearly visible from above and identifiable from the ground. Take your GPS to those points, and take long, long, long collections of readings (an hour is good) at each one. Take those readings home, and post-process them to remove even more error. Average your corrected readings for each point, and you now have “ground control points”.
  • Use those control points to fit your aerial picture to the ground in some nice planar projection (like UTM), as in the sketch after this list. Digitize all the rest of your locations directly off the picture.
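Here is the fitting step from the last bullet as a minimal sketch, again in Python with numpy. The pixel and UTM coordinates are made up, and in practice a tool like GDAL handles this for you, but the underlying math is just a least-squares affine fit from three or more ground control points:

    import numpy as np

    def fit_affine(pixel_xy, ground_xy):
        """Least-squares affine transform from pixel coordinates to ground
        coordinates, estimated from three or more ground control points."""
        px = np.asarray(pixel_xy, dtype=float)   # (n, 2) image column/row
        gx = np.asarray(ground_xy, dtype=float)  # (n, 2) UTM easting/northing
        A = np.column_stack([px, np.ones(len(px))])
        coeffs, _, _, _ = np.linalg.lstsq(A, gx, rcond=None)  # (3, 2)
        return coeffs

    def pixel_to_ground(coeffs, pixel_xy):
        """Apply the fitted transform to any pixel locations."""
        px = np.asarray(pixel_xy, dtype=float)
        return np.column_stack([px, np.ones(len(px))]) @ coeffs

    # Hypothetical GCPs: averaged GPS positions for three identifiable corners.
    gcp_pixel = [(120, 80), (3010, 95), (1500, 2400)]
    gcp_utm = [(481200.0, 5456800.0), (483950.0, 5456790.0), (482550.0, 5454600.0)]
    T = fit_affine(gcp_pixel, gcp_utm)
    print(pixel_to_ground(T, [(1600, 1200)]))  # any feature digitized off the image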

Take a deep breath. You are now a paleotard, but a happy, contented paleotard with very well located data.

This posting derives from a very interesting discussion on Geowanking from some months back. The question was “how do I map my two-acre property at very high accuracy?”, and while the initial guess of the poster involved using a variation on GPS tracing, the best final answer was a hybrid solution.

Wheels Within Wheels

Sean says he supported Obama because he is more “electable”. So, what are we to make of this notice in the New York Times?

Senator John McCain’s presidential campaign said Thursday that it stood by a year-old pledge made with Senator Barack Obama that each would accept public financing for the general election if the nominee of the opposing party did the same.

This isn’t about campaign financing, since that issue would keep until after the Democratic nominee is decided. This is about pumping up Obama by treating him like the presumptive nominee, putting yet more wind in his sails heading into Texas and Ohio, where an Obama win will put a stake through the heart of the Clinton campaign. Could it be the McCain campaign doesn’t share Sean’s take on who the more electable Democrat is?

Keep your Stinking Rasters Out of My Database

W00t! Yeah! Sing it brother!

http://spatialgalaxy.net/2008/02/15/rasters-in-the-database-why-bother/

Timmy's Telethon #4

Out of left field:

  1. Compatibility: This problem reaches across the board, but when it comes to open source vs. closed source, from what you’ve seen, is it a wash? I must admit that I’m inclined to stick with the devil that I and everyone else knows. Additionally, doesn’t the nature of open source introduce opportunities for proprietary stovepipes?

This one I frankly do not understand, but perhaps it is in the nature of the word “compatibility”.

In general, open source software is wildly more interoperable and therefore “compatible” with different proprietary stovepipes than the proprietary alternatives. Stovepipes? Mapserver, for example, can pull data out of dozens of file formats, as well as SDE, Oracle, geodatabase, and the usual open source suspects like PostGIS. Because open source development priorities are “scratch the itch” and people in real offices need to do real work, one of the first features requested and funded is almost always “connect to my proprietary database X”.
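To make that concrete, here is a sketch using the GDAL/OGR Python bindings, which expose the same small API over dozens of formats. The file path, connection parameters, and field name below are invented for illustration:

    from osgeo import ogr

    # Same API, different back ends: only the datasource string changes.
    shp = ogr.Open("roads.shp")                                  # shapefile
    pg = ogr.Open("PG:host=localhost dbname=gis user=gisuser")   # PostGIS
    # oci = ogr.Open("OCI:scott/tiger@orcl")                     # Oracle Spatial

    layer = pg.GetLayerByName("roads")
    feature = layer.GetNextFeature()
    while feature:
        print(feature.GetField("name"))
        feature = layer.GetNextFeature()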

(This is not just about Mapserver either: Geoserver, uDig, gvSIG, QGIS, even good old GRASS all have better multi-format connectivity than leading proprietary brands. Note the word “leading”… the non-leading proprietary brands tend to have good connectivity too, but the market leader uses lack of interoperability as a means to protect their lead.)

If “compatibility” is really a synonym for “does it work with ArcMap”, then indeed there is a problem, but it is not on the open source side. ESRI ties ArcMap tightly to their own stovepipe for very good (to them) reasons of competitiveness and market protection. ArcMap sells ArcSDE licenses, not vice versa. (How many times have you heard someone say “this SDE stuff is great! if only I had a mapping tool to work with the data…”)

I’m open to suggestions as to what part of open source “nature” actually lends itself to proprietary stovepipes. That part I don’t get at all.

Timmy's Telethon #3

Neither rain nor sleet nor dark of night will stop me:

  1. Sociology / Psychology: There is so much more to pitching a solution to those who control the purse strings than technical logic. Selling open source to the powers that be is… well, it’s tough. Are there consultants that can be hired to answer the hard questions and make the decision makers feel comfortable? I would love to see more real-world business cases for comprehensive GIS environments that must cater to diverse requirements.

25 years ago, selling anything that wasn’t IBM to the powers-that-be was tough. I expect selling open source to remain tough, but get easier and easier over time as people with Internet-centric mindsets percolate up into decision making positions for larger organizations.

Sure, there are consultants, as in any field. Real-world business cases are available, but few and far between. When you ask people to write their own case studies, generally the folks with interesting studies are too busy to do it, so you end up with a bunch of academic stuff. And going out to gather the things is a lot of work. And often some of the most compelling studies are found by accident!

I compile case studies for PostGIS, and every 12 months I ask on the postgis-users mailing list (1500 subscribers) for folks to volunteer for case studies. I’ll do the writing and editing; they just need to talk to me on the phone or over email. I don’t get many responses!

When I was at FOSS4G 2006 two years ago, it was mentioned in passing to me that “oh yes, IGN is using PostGIS now”. So I asked for a contact name, tracked the guy down, and extracted the story from him. It’s a great case study! But getting it required a curious combination of persistence and luck. Same with the North Dakota study, which I got via friend-of-a-friend referral.

The request for information about “comprehensive GIS environments” feels like trolling for someone to say “just drop ESRI”! And that would be silly. Open source has lots of solutions that do things that ESRI products do, but there is no comprehensive drop-in replacement solution. Even if there were, the switching costs alone would probably make it uneconomic.

So here is what to say to the powers-that-be. Don’t foam at the mouth, don’t wear jams and a backwards baseball cap. Recognize that change is slow, and there are sound financial reasons for that:

For an organization just getting started with open source, it provides advantages at the margins: not in reworking your existing systems, but in giving you flexible options when building new ones. The existing systems should be left running until they hit a natural end-of-life, either when they become out of date, or so expensive to run/pay maintenance on that the switching cost actually becomes acceptable. Evaluate the cost of change regularly. Sometimes not changing is the more expensive option, and it is important to know when that time arrives.