Timmy's Telethon #5

It’s not all about Timmy; Timmy worries about the little people too!

  1. Business model: I think I’m missing the business opportunity behind open source. Perhaps it’s altruism or a hobby… but that only goes so far. How are people leveraging this to pay the bills? Are services where the money is at?

Do you litter? When you put your garbage in a can, instead of dropping it on the street, is it altruism, or self-interest? After all, you’re keeping your own environment clean along with everyone else’s. What about the fellow who actually bends over and picks up someone else’s empty coffee cup to put into the can?

Open source turns software into a commons, like the environment, but because we are used to thinking of software as property, our minds find it hard to figure out why open source software aggregates effort and gets cleaner, faster, stronger. Unlike picking up litter, which everyone can do, improving open source can only be done by a small percentage of the population who can understand and work on the code, and that makes it even harder for the rest of the population to understand what is going on. WTF are these lunatics up to, moving bits of used paper into cylindrical holding devices?

The people improving open source fall into lots of categories, but here are some broad ones:

  • The altruist / tinkerer. The most popular media archetype, the altruist / tinkerer probably accounts for the smallest amount of effort on mature open source technologies, but occasionally will create a new technology from scratch that is so good and compelling that it gathers other types who then keep it alive and move it into wider utility.
  • The service provider. The most widely understood business model, running from the one-man band (take a bow, Frank Warmerdam) to billion-dollar companies like Red Hat. They will sell you support, or custom development, directly on the software. Because they are tightly associated with the software, this is the easiest model for “vendor minded” folks to mentally grasp.
  • The systems integrator. Working with the open source software, but not necessarily on the software, the systems integrator is easily drawn into fixing bugs and adding new functionality on projects as the client requires. Systems integrators love open source because it allows them to meet client needs without being stuck behind a vendor’s development priorities (“we’ve got your bug report on file for the next release”, “that feature will be available in 18 months”).
  • The company man. Easily the least appreciated member of the open source pantheon, because he is not paid to work on open source. He is paid to work on fish inventories. Or carbon models. Or inventory management. Or tax collection. But he has a bit of discretionary time, and he uses it to make the tools he works with work better.

Notably missing from this list is “the billionaire” and “the venture capitalist”. That is because, while there is money to be made in open source, there is not a lot of money to be made. People who try to put up fences in the open source commons find that all the rest of the players end up routing around them, and the value slowly drains from their little patch of land, leaving them only with the tried and true proprietary model to fall back on.

Attention Neotards: Your GPS Sucks!

Reading through an update from Open Street Map (incredibly, it has taken them mere months to load TIGER into their futuristic system), I came across this little gem regarding aerial photography:

Since the TIGER map data was produced from aerial photography, and was originally intended to assist Census Bureau officials in the field, such problems [misalignment of roads] are bound to occur and are unlikely to have undergone official correction.

I’m not sure what to mock first: the leap to the conclusion that TIGER data problems are a result of aerial photography, or the related conclusion that GPS tracing is somehow superior to aerial photography. They are, of course, closely related in the mind of the neotard: “Hmm, TIGER is old; aerial photography is old; TIGER sucks; aerial photography is old and expensive (so it sucks); therefore, the reason TIGER sucks is because aerial photography sucks!”

(Admission: I don’t know why TIGER positions are so bad in places. The answer may well be the source, but it cannot be hung off of aerial photography. Much of TIGER is sourced from lower levels of government and then stitched in, so it is probably a pastiche of capture methods. I wouldn’t be surprised if some of it was digitized off of paper road maps. Or if some of it has not been positionally updated in 25 years.)

Paleotard

Oh, pity the poor paleotards, who don’t know any better, wasting good money flying about taking error-prone “photographs”, instead of doing the smart thing and walking around with a Garmin. (Is that a Garmin in your pocket, or are you just happy to see me?)

I admit, I suffered from “GPS is magic” syndrome for quite a while, but I had the good fortune to be exposed to people whose job it is to make base maps, who have to ensure that the lines they place on the map (in the database) are as close to the “true” location as possible, given the technology available. That exposure taught me some useful things about source data collection, and one thing it taught me is that GPS traces are extremely hard to quality control.

The GPS has a very hard job to do. It has to read in multiple signals from satellites and calculate its location based on very, very, very small time differences. What happens when the signals intermittently drop because trees overhead block them? Or bounce off of a nearby structure? The unit makes errors. Which would be fine, except it’s hard to isolate which observations are good and which ones are bad. Too often, GPS track gathering falls back on heuristics that delete “zingers” (really obviously bad readings, determined visually or with a simple algorithm) and assume every other observation is “good”. If you delete zingers and take enough repeated traces, you can slowly get averaged lines that converge towards meter-level accuracy. However, the need for multiple traces radically increases the manpower/cost of gathering decent data, and the accuracy level does max out after a while.
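For the non-paleotards following along, here is roughly what a “delete the zingers” heuristic looks like, as a minimal Python sketch. The 40 m/s speed ceiling, the tuple layout, and the function names are all my own assumptions, not anything a real unit or toolchain mandates, and real cleaning pipelines are fussier. Note how it bakes in the exact flaw described above: whatever survives the filter is simply assumed to be good.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon fixes."""
        R = 6371000.0  # mean earth radius in metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def drop_zingers(fixes, max_speed_ms=40.0):
        """Drop fixes implying an impossible speed from the last kept fix.

        `fixes` is a time-sorted list of (timestamp_seconds, lat, lon).
        Anything implying more than `max_speed_ms` (about 144 km/h, an
        arbitrary ceiling) is discarded as a zinger; every survivor is
        assumed good, which is exactly the weakness described above.
        """
        if not fixes:
            return []
        kept = [fixes[0]]
        for t, lat, lon in fixes[1:]:
            t0, lat0, lon0 = kept[-1]
            dt = t - t0
            if dt <= 0:
                continue  # duplicate or out-of-order timestamp
            if haversine_m(lat0, lon0, lat, lon) / dt <= max_speed_ms:
                kept.append((t, lat, lon))
        return kept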

The answer to getting good positional information over a large area is to tie together the strengths of GPS systems and the strength of remote sensing (aerial, satellite) systems.

  • Take a picture from above (or better, borrow one, from the USDA, or USGS, that has already been “orthocorrected”). This provides a very good source of relative measurement information: you can determine very precisely the distance and bearing between any points A and B. But it has no absolute positioning: how do I know the longitude/latitude of A and B?
  • Find several points at the edges of your photograph that are clearly visible from above and identifiable from the ground. Take your GPS to those points, and take long, long, long collections of readings (an hour is good) at each one. Take those readings home, and post-process them to remove even more error. Average your corrected readings for each point, and you now have “ground control points”.
  • Use those control points to fit your aerial picture to the ground in some nice planar projection (like UTM). Digitize all the rest of your locations directly off the picture. (A sketch of this fitting step follows the list.)
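Here is a minimal Python sketch of that last step: average the repeated GPS readings at each control point, then least-squares fit an affine transform from image pixels to ground coordinates. It assumes numpy, an already-orthocorrected image for which an affine fit is adequate, and entirely made-up coordinates; real georeferencing tools offer fancier transforms (polynomial, rubber sheeting) than this.

    import numpy as np

    def average_readings(readings):
        """Collapse many (easting, northing) GPS readings taken at one
        control point into a single averaged coordinate."""
        return np.asarray(readings, dtype=float).mean(axis=0)

    def fit_affine(pixel_xy, ground_xy):
        """Least-squares affine fit from image pixels (col, row) to
        ground coordinates (easting, northing). Needs three or more
        matched points; extras just tighten the fit."""
        pixel_xy = np.asarray(pixel_xy, dtype=float)
        ground_xy = np.asarray(ground_xy, dtype=float)
        design = np.hstack([pixel_xy, np.ones((len(pixel_xy), 1))])  # N x 3
        coeffs, _, _, _ = np.linalg.lstsq(design, ground_xy, rcond=None)
        return coeffs.T  # 2 x 3 matrix A: ground ~= A @ [col, row, 1]

    def pixel_to_ground(A, col, row):
        """Turn a digitized pixel coordinate into a ground coordinate."""
        return A @ np.array([col, row, 1.0])

    # Hypothetical example: four control points, pixels matched to UTM.
    pixels = [(10, 20), (980, 15), (990, 760), (15, 770)]
    utm = [(480100.0, 5455900.0), (481070.0, 5455910.0),
           (481085.0, 5455160.0), (480110.0, 5455150.0)]
    A = fit_affine(pixels, utm)
    print(pixel_to_ground(A, 500, 400))  # ground position of a digitized point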

Take a deep breath. You are now a paleotard, but a happy, contented paleotard with very well located data.

This posting derives from a very interesting discussion on Geowanking from some months back. The question was “how do I map my two acre property at very high accuracy?”, and while the poster’s initial guess involved a variation on GPS tracing, the best final answer was a hybrid solution.

Wheels Within Wheels

Sean says he supported Obama because he is more “electable”. So, what are we to make of this notice in the New York Times?

Senator John McCain’s presidential campaign said Thursday that it stood by a year-old pledge made with Senator Barack Obama that each would accept public financing for the general election if the nominee of the opposing party did the same.

This isn’t about campaign financing, since that issue would keep until after the Democratic nomination is decided. This is about pumping up Obama by treating him like the presumptive nominee, putting yet more wind in his sails heading into Texas and Ohio, where an Obama win will put a stake through the heart of the Clinton campaign. Could it be the McCain campaign doesn’t share Sean’s take on who the more electable Democrat is?

Keep your Stinking Rasters Out of My Database

W00t! Yeah! Sing it, brother!

http://spatialgalaxy.net/2008/02/15/rasters-in-the-database-why-bother/

Timmy's Telethon #4

Out of left field:

  1. Compatibility: This problem reaches across the board, but when it comes to open source vs. closed source, from what you’ve seen, is it a wash? I must admit that I’m inclined to stick with the devil that I and everyone else knows. Additionally, doesn’t the nature of open source introduce opportunities for proprietary stovepipes?

This one I frankly do not understand, but perhaps it is in the nature of the word “compatibility”.

In general, open source software is wildly more interoperable, and therefore more “compatible” with proprietary stovepipes, than the proprietary alternatives. Stovepipes? Mapserver, for example, can pull data out of dozens of file formats, as well as SDE, Oracle, geodatabase, and the usual open source suspects like PostGIS. Because open source development priorities are “scratch the itch” and people in real offices need to do real work, one of the first features requested and funded is almost always “connect to my proprietary database X”.

(This is not just about Mapserver either: Geoserver, uDig, gvSIG, QGIS, even good old GRASS all have better multi-format connectivity than leading proprietary brands. Note the word “leading”… the non-leading proprietary brands tend to have good connectivity too, but the market leader uses lack of interoperability as a means to protect their lead.)
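Much of this format-agnostic reach flows through shared libraries like GDAL/OGR, where one reading API fronts dozens of format drivers. A minimal sketch in Python, assuming the GDAL bindings are installed; the file name and PostGIS connection string are hypothetical, so point them at your own data:

    from osgeo import ogr

    # One reading loop, many backends: a shapefile and a PostGIS database
    # here, but the same code works for any format with an OGR driver.
    sources = [
        "roads.shp",
        "PG:host=localhost dbname=gis user=gisuser",
    ]
    for source in sources:
        ds = ogr.Open(source)
        if ds is None:
            continue  # no driver compiled in, or the source is missing
        layer = ds.GetLayer(0)
        print(source, "->", layer.GetFeatureCount(), "features")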

If “compatibility” is really a synonym for “does it work with ArcMap”, then indeed there is a problem, but it is not on the open source side. ESRI ties ArcMap tightly to their own stovepipe for very good (to them) reasons of competitiveness and market protection. ArcMap sells ArcSDE licenses, not vice versa. (How many times have you heard someone say “this SDE stuff is great! If only I had a mapping tool to work with the data…”)

I’m open to suggestions as to what part of open source “nature” actually lends itself to proprietary stovepipes. That part I don’t get at all.