Mike Pumphrey over at the Geoserver blog has written a short post about this year’s Geoserver-vs-Mapserver comparison. I hope we can maintain this study as an annual event, and even get someone with an ArcServer license to join in the fun. Each iteration finds new areas that need work and raises the bar a little higher every year.
Basically, some differences are small and ignorable, and some are really anomalous. At the end of the day, both systems are doing the same thing, so order-of-magnitude performance differences are cries for help.
I’ve been focusing on the Mapserver side. Last year, the study by Brock and Justin found an odd quirk: Mapserver got progressively worse at shape file rendering as the shape files got bigger. I found the issue and fixed it this spring, and (w00t!) Mapserver won the shape file race this year.
But… this year’s tests found that the PostGIS performance in Mapserver, while fast, was about half that of Geoserver. Hmmmm. So I know what I’ll be working on this month. I have some guesses, but they will need to be tested.
Andrea added some aesthetic tests this year, and brought them to the attention of the Mapserver team, and as a result the next release of Mapserver will include more attractive labeling results and line width control.
Any development team that’s willing to swallow their pride (because for every test you win, there’s one you’ll lose) can get a lot of benefit from joining in this benchmarking exercise.
And your enemies closer. It seems ESRI has yet to learn that particular piece of the wisdom of Sun Tzu, and that’s too bad. By excluding “competitors” that are very small compared to the overall marketplace, ESRI is being penny-wise and pound-foolish. Sure, open source will steal a few accounts here and there, but the real prize is to co-opt them into your ecosystem, where you can keep an eye on them, a lesson Microsoft has clearly learned.
I’m a cowboy. I like to just slap a brand on the cattle and push them out the gate. Sometimes this gets me in trouble.
Jesse Eichar, the uDig project lead, is not a cowboy. The 1.1.0 release comes after a series of 14 (fourteen) “RC” versions and three “SC” versions. Congratulations to Jesse, and to Jody and Andrea and other uDig team members, on “going gold” with the 1.1 release. Remember, if things aren’t perfect, there’s always 1.1.1!
One thing watching the uDig development process has taught me over the years is how much harder user-facing applications are than server-side ones. The number of places you can “get it wrong” is orders of magnitude greater. The number of ways you can fine tune and fine tune and fine tune a particular piece of interaction is almost infinite (the editing tools are something like major revision four since the project started, and I’m sure there will still be things to be changed and fiddled with, given the hyper-modality and hyper-interactivity of editing). It has given me a lot more respect for the people writing web browsers and word processors and all the other virtual tools that we use every day. And now I automatically quadruple estimates that involve user interfaces, instead of merely doubling them as I used to.
The “cost of the bailout” has been a big election meme south of the border, and I continue to be flabbergasted at how primitive the media discussion of the issue has been. The first debate event began with a question that essentially said “given the $700B cost of the bailout, what parts of your campaign platform would you cut to pay for it?”.
How about: none of it. How about: tell you what, I’m going to spend more.
The US of A is going to sell $700B worth of Treasury Bills (bonds) to various countries, institutions and people – China, Saudi princes, sovereign wealth funds, foreign banks, and so on. For short, we’ll call them “the Suckers”. These bonds are going to pay the Suckers some absurdly low interest rate, like 2% or less.
The US of A is then going to turn around and exchange the $700B it got from the Suckers for preferred shares in the banks, which will pay 5% for the first five years, and 10% after that.
So, rather than costing the benighted tax payers of the USA anything, this “bailout” is actually going to net the Treasury the 3% spread (5% earned minus 2% paid) on $700B, which works out to $21B a year. The only people with anything to complain about might be the Suckers, and it’s not like anyone is forcing them to buy Treasury Bills.
Why is this rather elementary fact not finding its way into the political discussion of the “bailout”? Too much math?
Martin Davis just posted about his improvements to the JTS buffering routines, speeding up buffering by a mere factor of 20 or so.
Martin has also added some improvements in the area of unions for large sets of geometries, a technique he calls “cascaded union”. It too is good for orders-of-magnitude performance improvements.
Do you have PostGIS queries that buffer geometries, or union large sets of them?
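The example query appears to have been dropped from this post. Given that Martin’s improvements are to buffering and to unions over large geometry sets, the queries in question would look something like the following sketch (the table and column names here are placeholders, not from the original):

```sql
-- Buffering: each ST_Buffer call runs through GEOS, so a faster
-- buffer algorithm speeds this query up directly.
SELECT ST_Buffer(the_geom, 100.0) FROM parcels;

-- Aggregate union over many geometries: the case that
-- "cascaded union" is designed to accelerate.
SELECT ST_Union(the_geom) FROM parcels GROUP BY county;
```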
If you do, then getting Martin’s JTS algorithms ported to GEOS (the C++ geometry library used by PostGIS) will make your database run faster. Lots faster.
How can you help make that happen? Become an OSGeo “Project Sponsor” for GEOS. Project sponsors commit a modest sum to the ongoing maintenance of the code, which is generally used to hire a maintainer who ensures that patches are properly integrated, that tests are added for reliability, and that upgrades like the ones Martin has created get folded into the code base in a timely manner.
If you’re interested in sponsoring GEOS development, please get in touch with me. If you are using PostGIS in your business, it is money well spent.