04 Dec 2008
There are a couple of rate-limiting steps in building libraries of free data, centralized in one place, for all to use. First, it costs a lot (for someone trying to provide a free service) to store and provide access to large amounts of data. Second, if you don’t get enough eyeballs, the large effort of maintaining the library starts to feel pointless, and eventually you stop, driving the value of your collection to zero.
Could the new Public Data Sets service at Amazon Web Services be a break from the dismal tradition of free data set aggregation? There is a limitation, in that you are meant to access the data sets from the AWS cloud (reducing Amazon’s bandwidth investment in hosting them), but with many apps migrating into the cloud anyways, this could be quite a boon.
From TechCrunch.
04 Dec 2008
So, I was at a client site this week, doing a few days of reviewing their application and providing advice on PostGIS design, MapServer performance, all my favorite things. And we come to the last day, and they say “you’ve been talking about how our application would look so good if we used OpenLayers and ExtJS and how great those tools are so… how about you mock up a little data entry application using our data for us this morning, before your flight?”
Glurp!!
I’m not much of a web programmer, but fortunately OpenLayers and MapFish have adopted a policy of “documentation by example”. OpenLayers is by far the leader in this, eschewing tutorials in favor of a long list of tiny example pages, each one demonstrating a discrete unit of functionality.
Since “mediocre authors borrow, great authors steal”, I set about finding something I could steal that would get me closer to my goal. Fortunately, I quickly found what I wanted in the client code base of MapFish – MapFish is an ExtJS/OpenLayers framework, so it had the components I was yapping about, and it included a simple editing example.
Starting from there, hooking up the client’s map services, using the OpenLayers examples to grab some extra layer types, and adding a few buttons, I had the desired proof of concept in plenty of time to make my afternoon flight. Thanks MapFish and OpenLayers, for making me look so damned good!
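For the curious, the skeleton of such a page is tiny. What follows is not the client’s code, just a minimal sketch of the pattern the MapFish example demonstrates, using the OpenLayers 2 API of the day: a WMS base layer, a vector layer to draw into, and the stock editing toolbar. The service URL and layer name are placeholders.

```typescript
// Minimal sketch: assumes the OpenLayers 2 script is loaded globally.
declare const OpenLayers: any;

const map = new OpenLayers.Map('map');

// A WMS base layer (placeholder endpoint and layer name).
const base = new OpenLayers.Layer.WMS(
    'Base Map',
    'http://example.com/wms',
    { layers: 'basemap' }
);

// A vector layer to hold the features being digitized.
const editable = new OpenLayers.Layer.Vector('Editable Features');

map.addLayers([base, editable]);

// The stock editing toolbar: navigation plus point, path, and
// polygon drawing tools, all targeting the vector layer.
map.addControl(new OpenLayers.Control.EditingToolbar(editable));

map.zoomToMaxExtent();
```

Wrap an ExtJS panel around the map div and you have the bones of a data entry application.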
01 Dec 2008
Four bids this year, from the, er, three corners of the earth: China, North America, and Europe (times two, Barcelona and Utrecht). The antipodes, apparently sated after the winning bids from South Africa and Australia, are not represented this year.
My criteria will, as always, tend towards the practical matters of putting on a financially successful conference. Is there a track record of past performance? Is the local organizing committee well connected and able to bring in local sponsorships? Has the bid thought about some of the practicalities of putting on a conference for many hundreds of people? Is there a strong leader attached to the bid who can cut through committeeitis and get decisions made quickly? Of course, my own criteria will be subordinated to the overall conference committee criteria, and the process described therein, but that’s where I tend to come from. I like to see a bid that makes me think “yes, those people could put on a half-million dollar conference with no problems at all”.
One thing I hope we will manage well this year is getting a good discussion going with the proponents during the evaluation phase. In 2009 it didn’t matter, because we had just one bid, but in 2008 we never got any IRC chat or back-and-forth going, and I didn’t feel like I really knew who the proponents were.
30 Nov 2008
Governments around the world are poised to unleash untold billions of dollars in spending on infrastructure, in stimulus packages that hope to cushion the landing we are currently plummeting towards. Ron Lake posits that the initiatives will require a meta-investment, that
We are going to need to invest in infrastructure (information infrastructure) for infrastructure!
It is probably not a good sign that I, a professional technologist, am chilled to the bone by the prospect. Information infrastructure historically has an obsolescence period that does not stack up well against physical infrastructure. Bridges last fifty, a hundred, sometimes a thousand years. Government IT is planned for replacement within a decade, and occasionally pushed out as far as two decades in oddball cases.
It got me thinking about the government IT investments that have shown longevity and been repeatedly leveraged by the economy over the years. And the two obvious ones seem to be data and open source software. The topographic mapping done by the US government and published by the USGS has been used and re-used so many times, in so many contexts, that the investment has been repaid many, many times over. Same thing with the TIGER data. Investing in very core data sets and making them freely available seems to generate knock-on economic effects for years.
The same thing has been happening with investment in core geographic software libraries. The Proj4 reprojection library had its genesis at the USGS and has been integrated into many, many pieces of software, both open and proprietary. The JTS/NTS/GEOS geometry library (with its genesis in Canadian government funding) now lives inside many open and proprietary software packages.
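To make that concrete, here is a minimal sketch of the kind of one-line reprojection such a library makes routine. It uses proj4js, one of Proj4’s many descendants, with its modern API; the coordinates are just an illustration, not from any of the projects above.

```typescript
// Minimal sketch: assumes the proj4js script is loaded globally.
declare const proj4: any;

// Reproject a longitude/latitude pair into spherical mercator metres.
// EPSG:4326 and EPSG:3857 are among proj4js's built-in definitions.
const lonlat = [-123.37, 48.42];                  // roughly Victoria, BC
const mercator = proj4('EPSG:4326', 'EPSG:3857', lonlat);
console.log(mercator);                            // [x, y] in metres
```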
The difficulty is perhaps in distinguishing what pieces of data and software are “core” and can get maximum leverage and re-use over time, and are therefore worth “investing” in. It’s not as obvious as in physical infrastructure what pieces of software and data are “roads” and which ones are “trucks”.
27 Nov 2008
Is this “neogeography”? At the minimum, it’s someone seeing a powerful tool at his disposal and pressing it into service, rather than waiting for ESRI to release a suitably branded version. Check out Smathermather, scroll right to the bottom and work your way up. Faaaabulous!