Transparent PNG for IE6

I’ve been wrapping our web content in some new web designs, and one of the issues I have encountered is supporting transparent PNGs in IE6. It can be done, and all it requires is a relatively unobtrusive hack that uses the IE-only “behavior” CSS property.
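For the curious, the CSS side of the hack is just one line pointing at the behavior file; the .htc script then typically does the dirty work of swapping in IE’s AlphaImageLoader filter. A minimal sketch (the selector and file path are placeholders for whatever your design uses; note that IE resolves behavior URLs relative to the page, not the stylesheet):

/* IE6-only: other browsers ignore the behavior property entirely */
img {
  behavior: url("/css/iepngfix.htc");
}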

Strangely, though, while our web developer could serve pages that worked with this trick, when I implemented them on our own servers, it didn’t work! It took a while to realize that the problem wasn’t how I was implementing the hack (“check the URLs”, “are you line stripping the files?”, “make sure the files aren’t missing”) but rather where I was serving the hack from. Namely, from an old server running Apache.

IE6 would not execute the hack, which was bundled in an IE-only “behavior” file with a .htc extension. It would load the file (I could see that in the logs), but it never did anything. The problem was that my old Apache wasn’t serving up the .htc file with the MIME type that IE wanted.

So, one quick entry in /etc/mime.types and an Apache restart later:

text/x-component htc

And we’re golden.
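If editing /etc/mime.types isn’t an option on your server, the equivalent AddType directive in httpd.conf (or in an .htaccess file, if overrides are enabled) should do the same job:

AddType text/x-component .htc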

Chemistry Change

I am a sucker for catastrophe, so I am fortunate to live in this age of (as the Chinese might say) “opportunity”. We are privileged to be witness to the twin cataclysms of peak oil and global warming. It may not have the immediacy and drama of the sack of Rome but, baby, it’s got size.

In the past year, I finally got to see “An Inconvenient Truth”, which really is worth seeing, even if you already know all about global warming, because it is such a clean compendium of all the issues (scientific, political, cultural) that we are navigating on our way into this crisis. It brings everything into a neat package, tied with a bow: a truly great work of documentary filmmaking.

However, the real eye-opener for me in the last year was an under-appreciated article in the New Yorker, “The Darkening Sea” by Elizabeth Kolbert. The gist of the article is that by changing atmospheric chemistry (by injecting CO2 in higher concentrations) we are also changing oceanic chemistry (as the oceans absorb the extra CO2). The eye-opener is just how deep through the ecosystem this change in ocean chemistry reaches. I cannot recommend this article highly enough; it’s one of the best of the last couple of years.

Counting Votes

This is orthogonal to geospatial, but c’est la vie. Apparently the United States’ quadrennial meltdown over how to count things is firing up again.

A big part of the problem appears to be the decentralization of the US institutions responsible for conducting elections (in some cases each county gets to decide how to manage the electoral process), and another part is that the people responsible for running elections are themselves elected partisans (the state “secretaries of state”; for example, Katherine Harris and Kenneth Blackwell).

Rather than address these structural issues at the root (have elections run by a single federal organization with leadership acceptable to both parties), the USA keeps trying to fix the problem with new and better machines.

For reference, here is how we do it in the civilized world.

British Columbia provincial elections are managed by Elections BC, an independent agency of the government, the head of which is chosen by an all-party committee of the legislature. Federal elections are managed by Elections Canada, ditto.

Each polling station handles a number of polls, each of which has a few hundred people in it. Each poll has its own vote box. Each ballot is numbered, and comes from a book of ballots, so that the number of ballots that end up in a box can be reconciled to the number of voters in a poll, and to the books which were assigned to each poll. The polling stations are staffed by temporary paid workers, two workers to each box, so the box is never unattended. Parties are allowed to have “scrutineers” in the station, who may observe the process. Parties often keep an independent count of how many people have voted, and who has voted (this information is fed back to the central campaign and used to drive get-out-the-vote efforts, by calling known supporters who have not shown up to vote earlier in the day).

Paper ballot in a sealed cardboard box. It’s that easy.

At the end of the day, each box is counted aloud, under the eyes of the scrutineers and the poll supervisor. It takes about an hour to an hour and a half to count all the ballots in a poll. Because the scrutineers keep their own tallies during the count, boxes will sometimes get re-counted if the tallies don’t match.

Having an open process (everyone gets to watch, right down to the individual vote counts for each poll) both increases confidence in the process and improves the accuracy of the result, because many eyes are working on the problem at once. The poll supervisor reports the totals to his superior in the district, who in turn reports to the central electoral authorities. And the scrutineers independently report the poll numbers to their party headquarters, where they are totaled up to provide the politicians with an early snapshot of the race. Multiple eyes, multiple paths for the data to flow, physical ballots for post-facto processing.

In the event of a really close election, things slow down a lot, because a panel of judges has to sit down in a room and hand-count the complete set of votes for the whole riding (electoral district). This takes about a week. It’s the difference between parallel processing (each box counted simultaneously on election day) and sequential processing.

The reason such a primitive system can work better than all the fancy US computers is that the standards for things like ballots, ballot handling, ballot counting, box security, and so on are all set and managed centrally, by an independent agency (not an elections equipment vendor). It is basic logistics. Standard processes, mandated and followed consistently, make logistically difficult problems (like accurately gathering and counting two million individually marked ballots) achievable.

Enough with the machines, USA, get a decent organization and return to the pen and paper!

Open Source on a GSA Schedule

After much wailing and gnashing of teeth, Refractions secured a GSA schedule today (it’s not on the GSA web site yet, but we just got the “contract is in the mail” notification)! For those not in the know, it’s worth an explanation.

GSA

GSA is the US General Services Administration, a catch-all purchasing authority for the US federal government. Government purchasing is difficult, because spending taxpayers’ money requires a good deal more transparency and fairness than needs to be exercised in the private sector. “Requests for proposals” are a lot of work to write and evaluate, so many jurisdictions have routed around RFPs by creating the idea of a “standing order”: a pre-negotiated contract for specific products or services. GSA creates standing orders for the entire federal government, so having a GSA contract allows you to sell to a lot of clients with a lot less paperwork.

So this is a big deal for Refractions, and now opens the door to the question: can we sell enough open source (and other) geospatial services to keep our GSA contract? GSA contracts are a use-it-or-lose-it affair, so we have to hit minimum sales targets or they rip it up.

They Called It Off!

Good news from down Redmond way. The SQL Server spatial team has decided to make the coordinate order returned by their STAsText and STAsBinary functions consistent with the existing industry practice: (easting, northing) or (longitude, latitude) or (x, y), depending on how you look at it.
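For the record, here is a sketch of what that means in practice (hypothetical coordinates, using the SQL Server 2008 spatial syntax): what goes in x-first now comes back out x-first.

-- WKT goes in as (longitude latitude)...
SELECT geography::STGeomFromText('POINT(-123.37 48.42)', 4326).STAsText();
-- ...and comes back out the same way: POINT (-123.37 48.42)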

That kind of community responsiveness practically cries out for… a five pound box of chocolates! Note to Isaac and company: don’t eat them all in one sitting, no matter how tempting it might be!