Timmy's Telethon #2

Moving on:

  1. Risk: Our landscape is changing so fast there is an extreme amount of exposure to dumping resources into a solution that isn’t supported. If any one component of the enterprise stack changes you’re vulnerable, and I trust those I’m paying to cover my maintenance more than I do an ethereal community. When it comes to supporting my clientele I need tangible support resources. How good are the support resources for open source solutions? Is there comprehensive, up-to-date documentation? Can I call someone in an oh $*!&% moment?

I think this article at InformationWeek (number three in the Google search for “open source support”) sums it up well. There is not one answer; there are a number of answers, and you need to choose the one(s) that make sense for your needs.

  • Product support, from specialist companies with expertise in particular components (Refractions for PostGIS, DM Solutions for Mapserver, TOPP for Geoserver)
  • Stack support, from generalist companies putting together mixes of components (Wheregroup)
  • Community support, much maligned, but better than the technical support provided with most proprietary packages
  • Training, from folks like the companies above, or specialist trainers like Open Technology Group
  • Hiring project developers, an often unappreciated source of top-notch troubleshooting and knowledge
  • Consultants, who have to know the tools they use

For me, the money quote is:

CIOs would be well-advised not to buck the open-source trend. On the contrary, they should assume responsibility for open-source initiatives and ensure that their companies have the right support structures in place… They’ll find the more mature an open-source software project, the more mature support options its users enjoy.

Open source requires you to assume responsibility, which is hard for an organization man, with years of CYA behind him, to do. In exchange for taking responsibility for your own infrastructure, you are rewarded with a software ecosystem where there is more than one source of support.

What do open source organizations do when their support provider isn’t up to snuff? They get a new support provider.

What do ESRI customers do when ESRI support isn’t up to snuff? They bitch about it on James Fee’s blog.

Timmy's Telethon #1

So, addressing Timmy’s concerns about open source geospatial:

  1. Staffing: The specialized skills necessary to build and maintain an open source app are hard to come by. There is a premium on any specialization; is the talent pool to build and support these open source solutions deep enough to maintain continuity in staff skills?

There is, as with any great argument, a kernel of truth in this item, but it is wrapped in a thick, low-calorie, blanket of misdirection, like a corndog at a state fair.

So, should you be concerned about staffing your open source application? You should, to the extent that:

  • the skills required to understand and maintain it take a long time to learn, and
  • the skills required to understand and maintain it are in short supply.

Note that you have reason for concern only if both conditions occur: the skill must be both difficult to learn and in short supply.

Timmy sees that, compared to proprietary toolsets, people with prior experience with open source toolsets are fewer and farther between, and leaps to the conclusion that there is a skills-supply risk.

However, the skills necessary to work with open source geospatial applications are either easy to pick up quickly, or transferable from other domains.

  • PostGIS: Already worked with Oracle Spatial or ArcSDE’s “new” spatial SQL feature? You already know PostGIS.
  • Mapserver: Learn the .map file and you are good to go. No harder than picking up enough AXL to be useful. Budget a couple days of learning time.
  • OpenLayers: Already worked with Google Maps? You’ve got the concepts down pat. You’d better know JavaScript, but that’s a transferable skill, and you’ll need it for any non-trivial application.
  • Geoserver: Point and click through the interface. Do you know enough to deploy a WAR into production? If you installed ArcIMS, you already do. (See the sketch after this list.)
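
On that last point, here is roughly what deploying a WAR into production boils down to on a stock Tomcat install (a sketch only; the CATALINA_HOME path and the geoserver.war filename are illustrative, so adjust for your servlet container):

# Copy the WAR into Tomcat's webapps directory; Tomcat auto-deploys it.
cp geoserver.war $CATALINA_HOME/webapps/

# Bounce Tomcat (or just wait for the auto-deploy) and browse to /geoserver.
$CATALINA_HOME/bin/shutdown.sh
$CATALINA_HOME/bin/startup.sh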

The slight disadvantage open source has in providing decent tutorial-level guides for new users is offset by ready access to a very helpful user community and direct access to the development community.

Summary: No matter whether you’re building on ESRI or open source, if you are building something complex your staff will have to learn a few new skills. Their prior experience with core concepts like programming and IT will serve them well in both domains, and the learning curve will be no worse either way.

Caveat: It’s possible Timmy is talking about people who will only learn a new skill if force fed it through a training course, and even then will only retain 50% of what they are taught, who write point-and-click recipes and stick them up on their cubicle walls, who think that “re-boot the server” is a genuine solution. If those are the people whose “skills” he is worrying about, I can only say “go with God, Timmy, peace be unto you, you have larger problems than proprietary vendor lock-in”.

Weblicious

I’ve spent a fair amount of time over the last month re-doing three web sites: the PostGIS site, the uDig site, and the Refractions site.

Some of it, like the PostGIS site, has been easy, simply re-skinning the existing content (though I’m anticipating reworking the content there to make it more newbie-friendly and harmonized). Other bits, like the uDig site, are brand new, and include some big additions like a gallery of projects using uDig. The Refractions site includes piles of new content, like case studies, that have taken many days to write up. And I’m only about 50% through my list of candidates.

PostGIS users will find this nugget in the Refractions site fun: a potted history of PostGIS.

Timmy's Telethon #0

In the comment thread at James Fee’s posting on building an open source application in an “ESRI” shop, “timmy” provides the most complete laundry list of incumbent vendor objections to open source I have seen in some time.

The list is far too comprehensive to cover in one post, so I’ll take the items one at a time.

As a general note, many of the items are not really specific to open source or geospatial – they could be used by any incumbent market-leading vendor to attack a smaller competitor.

There is also an apples/oranges thing going on here, since the default GIS vendor (ESRI) is at a different point in the technology adoption cycle than open source. Open source can’t strongly appeal (yet) to conservative late adopters, and ESRI is finding it hard (at the moment) to appeal to technically savvy early adopters. (Technology book recommendation: Crossing the Chasm is a must-read for anyone thinking about the software market.)

Magick!

I’ve always liked ImageMagick, and frequently it is the first thing I install when I set up a new computer. For OS X, I have found that MacPorts makes the installation pretty painless, although it takes a while to compile all the dependencies.
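
If you have MacPorts set up, getting ImageMagick is a one-liner (the port name is ImageMagick as of this writing; it will pull in and build the dependencies for you):

sudo port install ImageMagick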

Like most ImageMagick users, I have barely scratched the surface of what this toolkit can do – I mostly use it for simple format conversions and image re-scaling. However, I recently had an image-processing problem that went beyond the ordinary – I wanted a general-purpose tool that would take any input image, scale it to 200 pixels wide, and create nice rounded corners, with transparency where the pixels used to be.

Basically, to go from a square-cornered original to the same image with rounded, transparent corners.

Since I plan on doing this a lot, I don’t want to fire up a graphics program every time and point-and-click my way to nirvana. Enter ImageMagick, my old friend!

#!/bin/bash

# Usage:
# ./roundclip.sh [inputfile]

# Output width
OUTPUTWIDTH=200
# Corner size
CSIZE=20

IMG=$1
EXT=${IMG##*.}
BASE=`basename $IMG .$EXT`
OUTFILE=PNG8:${BASE}_round.png
TMPFILE1=/tmp/${BASE}-1-$$.png
TMPFILE2=/tmp/${BASE}-2-$$.png
MASK=/tmp/mask-$$.png
TRANSPARENT=/tmp/transparent-$$.png

# Scale the input down to our desired width
convert $IMG -scale ${OUTPUTWIDTH}x $TMPFILE1

# Find out the height and width of the working file
DIM=`identify -format %wx%h $TMPFILE1`
W=`identify -format %w $TMPFILE1`
H=`identify -format %h $TMPFILE1`

# Calculate the lower corner coordinates
X=$(($W - 1))
Y=$(($H - 1))

# Make a clipping mask with rounded corners
convert -size $DIM xc:black \
    -fill white \
    -draw "RoundRectangle 0,0 $X,$Y $CSIZE,$CSIZE" \
    $MASK

# Make a transparent underlay
convert -size $DIM xc:transparent $TRANSPARENT

# Place the masked input image onto the transparent underlay
composite -compose src-over $TMPFILE1 $TRANSPARENT $MASK $TMPFILE2

# Convert to the output format, and do some color reduction
convert $TMPFILE2 -quality 90% $OUTFILE

# Clean up the temporary files
/bin/rm $TMPFILE1 $TMPFILE2 $MASK $TRANSPARENT

There are probably much more efficient ways to do this, with fewer intermediate steps, but I am not a guru yet.
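
For the record, the ImageMagick Usage documentation shows a way to do the whole rounded-corner job in a single convert command, building the corner mask from the image itself and copying it in as the alpha channel. A minimal sketch, assuming an input of input.jpg, an output of input_round.png, and the same 200-pixel width and 20-pixel corner radius as the script above (needs a reasonably recent ImageMagick with -alpha support):

# Scale, draw one rounded corner into an extracted alpha mask,
# mirror it to the other three corners, then copy the mask back
# in as the image's alpha channel.
convert input.jpg -scale 200x \
    \( +clone -alpha extract \
       -draw 'fill black polygon 0,0 0,20 20,0 fill white circle 20,20 20,0' \
       \( +clone -flip \) -compose Multiply -composite \
       \( +clone -flop \) -compose Multiply -composite \
    \) -alpha off -compose CopyOpacity -composite PNG8:input_round.png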

Using other drawing and blurring techniques, it’s also possible to create drop-shadows on the fly…