Bringing ICM to the Natural Resources Sector

What could possibly go wrong?

Bringing the unalloyed success that is ICM to the rest of government is a no-brainer! (Does that mean you need no brain to recommend it, or no brain to accept the recommendation? What an odd phrase.) The strategic plan for the BC natural resource sector includes a case (authored by whom, I wonder) for “re-using” the ICM technology in natural resource permitting.

The business case will go to Treasury Board sometime this year, and we’ll soon be off to the races, with a huge capital allocation for systems development that can only be bid by a major multi-national. Thanks to cabinet secrecy and third-party commercial exemptions to the BC FOI law, the taxpayers footing the bill for this extravaganza will get to see neither the full business case, nor the winning bid, nor the billing schedules of the winning bidder.

If we’re lucky we’ll get to see a redacted PowerPoint presentation given to Cabinet, which ironically enough will leave us just about as educated on the issues as the actual decision makers pulling the trigger on this new adventure. Ain’t technology decision making grand?

Victoria gets Portal'ed

For years, I just assumed that, as a cash-strapped municipality carrying much of the service load of the region (Victoria is the small (80K) urban centre of a metro area (330K) that comprises ten separate municipalities), my city would just never have enough money to splash out on a mapping webstacle. Today I was proven wrong.

I give you: VicMap

It’s got everything you expect from a “municipal map portal”. (For an explanation of the scare quotes, please see parts 1, 2, 3, 4 and 5 of “why map portals don’t work”.)

It’s got the Silverlight requirement, and associated “loading” gadget.

It’s got the pop-up disclaimer dialogue.

And it’s got the menu of all available layers.

It’s also got a task-oriented attempt to match usage to users: a good attempt, marred somewhat by GIS terminology (“identify”, anyone?).

Most of the data is already on the city’s open data site, with the exception of the aerial photography. The only positive I can see is that the site is built with local technology.

Anyhow, now that we have a “new shiny”, I hope the City follows up, keeps track of who uses it and how, and changes the site appropriately. I’ve long noted the popularity of the CRD (Capital Regional District) online atlas, another “portal par excellence”, but even my limited sampling has turned up a 100% use case of “viewing the hi-res imagery and parcel outlines” – the rest of it could be thrown out, or hidden in a file drawer somewhere.

Keeping metrics, honestly evaluating the utility of the site, and not being beholden to the all-in-one framework are the important next steps. The site is probably not being used for what you think it is, and your users might be better served not by one big slow map, but by several small fast ones.

But you won’t know unless you keep track of:

  • Usage patterns: day by day, hour by hour, region by region
  • Usage of layers: which are popular, and when
  • Usage of features: which are often used, and which are never used
  • Usage of locality: where do people look most often, and at what layers

With that information in hand, the one big map can be taken apart into pieces that address the actual needs of users as efficiently as possible.
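How cheap can that tracking be? If the layers are served using a standard protocol like WMS, the web server access log already holds most of the raw material. Here is a minimal sketch, in Python, of tallying requests by layer and by hour of day. The log format and file name are my assumptions (an Apache-style log named “access.log”), but the REQUEST and LAYERS parameters are standard WMS, so any compliant map server will emit them.

```python
# Sketch: tally WMS GetMap requests by layer and by hour of day,
# straight from an Apache-style access log. The file name and log
# format are assumptions; REQUEST and LAYERS are standard WMS.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

layer_hits = Counter()
hour_hits = Counter()

# Matches the timestamp and request URL of a common-format log line, e.g.
# 1.2.3.4 - - [12/Jun/2013:14:05:01 -0700] "GET /wms?... HTTP/1.1" 200 512
line_re = re.compile(r'\[\d+/\w+/\d+:(\d+):\d+:\d+ [^\]]+\] "GET (\S+)')

with open("access.log") as log:
    for line in log:
        m = line_re.search(line)
        if not m:
            continue
        hour, url = m.groups()
        # WMS parameter names are case-insensitive; normalize to lower case.
        params = {k.lower(): v for k, v in parse_qs(urlparse(url).query).items()}
        if params.get("request", [""])[0].lower() != "getmap":
            continue
        hour_hits[hour] += 1
        for layer in params.get("layers", [""])[0].split(","):
            if layer:
                layer_hits[layer] += 1

print("Busiest hours:", hour_hits.most_common(5))
print("Most popular layers:", layer_hits.most_common(10))
```

The BBOX parameter in those same log lines would yield the locality numbers. The point is that a first cut at every metric above is an afternoon of scripting, not a procurement.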

BC IT in the New New Era

Every change of government, even one that doesn’t involve a change of party, provides an opportunity for substantial change in broad or particular areas of policy. On taking the premiership two years ago, Christy Clark ratified and amplified the nascent open data policies her predecessor had been toying with, which for techno-weenies (guilty as charged) constituted a significant and positive change. (Non-techno-weenies will note the degradation in overall support for open government in general, but we’ll ignore them because they don’t like computers.)

Now, having won a surprise victory, taken full control of her party, and vanquished her enemies both outside it and within, will Christy Clark bring in any big changes in the world of BC government IT? My guess is “no”, or “yes, if a lot more of the same constitutes a change”.

The big tell is in today’s detailed news release on the new cabinet, which, in addition to the political leadership (yawn!), details the roster of Deputy Ministers, the administrative leadership.

Tell #1: Bette-Jo Hughes becomes the CIO. No longer acting (?), the architect of the IBM desktop services (we use the term loosely) agreement continues at the OCIO. So, continuity of policy in the boring but important worlds of procurement and standards, and presumably in major initiatives like the BC ID card as well.

Tell #2: Former CIO Dave Nikolejsin is plucked from purgatory at the Environmental Assessment Office and plunked down as DM of Energy and Mines (and Core Review). This one I think is far more interesting.

The “core review” gets to poke its fingers into any and every program in government. What will the man who brought us ICM, the BC Services Card, and the upcoming “one stop” natural resources system rebuild do when placed in charge of a review of all government programs?

I don’t know, but I have a theory: it will somehow involve Deloitte & Touche. Nikolejsin’s other initiatives have all been big-ticket, vendor-led enterprises, so it seems reasonable to conclude that D&T’s blend of management consultantese and IT bafflegabbery will prove irresistible in this case too. Think of all the business processes that can be streamlined and integrated with a mere $100M in IT capital spend! The savings on paperclips alone…!

One thing I’m fairly certain of: there will be no shortage of IT boondoggles to write about over the next four years. With old boondoggles still in train (ICM), older boondoggles rebooting (BCeSIS), new boondoggles in the works (natural resources), and top-secret boondoggles waiting to come to light (eHealth), all of them nine figures and up, the world of BC government IT will not disappoint.

BC ICM Footnote

Early this year, the ICM project released a system assessment report from Queensland Consulting, which was seized on by the BCGEU and others as showing that ICM is “deeply flawed” and that “serious questions remain regarding the software’s suitability for child protection work”.

I found two things curious about the Queensland report: first, it was willing to talk about the failings of COTS methodology in a way that is rare in the stodgy, conservative world of enterprise IT; and second, a large number of its process recommendations were reiterations of a previous report, a “Readiness Assessment” by Jo Surich, Victoria tech éminence grise and former BC CIO. If the Surich report was so good that Queensland was quoting from it rather than writing their own material, I wanted to read it, so I filed an FOI request.

You can read the full Surich report now, on the BC open info site.

I found two items of particular interest in Surich’s report.

First, the date. Surich reported out in April of 2011; over a year later, the Queensland consultants were reiterating many of his recommendations. Surich was not listened to. Since a second set of consultants saw the same problems he did, letting his recommendations gather dust was a tactical error, to say the least.

Second, the big-picture plan. Surich notes that, in addition to replacing all the tracking and reporting systems used in child protection, the Ministry was simultaneously changing the practice model (the “business process”, in the usual IT terminology) that social workers use for their cases.

Simultaneously changing the business process and the technology is a time-honoured and widely replicated failure mode in enterprise IT development, because it makes so much sense: if you’re changing the business process, you’ll need to change the systems to match it. So why not replace the systems at the same time as you change the business process?

Lots of reasons!

  • Your mutating business process will constantly change the system requirements underneath you, resulting in lots of back-tracking and re-coding.
  • You won’t know if your business process is bad until you deliver your system, which will then require further system changes as you again alter the business process.
  • You double down on changes your staff need to ingest, in both tooling and methodology, and triple down as you add in fixes to business process after deployment.
  • It’s been done before, and it’s led to some epic, epic failures.

As I learn more about the background to ICM, I have to ask myself whether I would have done anything differently. Particularly given the timelines and promises that backstop the huge capital commitment that gave birth to ICM, I find myself saying “no”: I’d have made the same (or similar) mistakes and walked down a very similar path. Probably not using COTS, but still trying to do business process and technology at once, still trying to deliver a complete replacement system instead of evolving the existing ones. A set of rational, correct, defensible decisions leading down a dead-end path to failure.

The Microsoft Era

In many ways, the “Microsoft Era” has been over for quite some time. The most important developments in technology, the ones that change the way we work as IT practitioners, have been coming from other organizations for at least five years, maybe a decade (first open source, then the cloud companies).

But Microsoft has held on, and even garnered some of the aura of the scrappy underdog as it tries to compete in the new “network is the computer” (so close, Sun: right slogan, wrong decade) world. The reason Microsoft has been able to hang in this fight for so long is the continuing “price of doing business” revenue it has been able to extract from its operating system and office automation franchises.

Why, despite its continuing failure to deliver useful new functionality into its offerings, is Microsoft still hanging in there, still receiving billions of dollars a year from its operating system and office software?

Because it’s good enough, and because there is no alternative.

More realistically, because we BELIEVE it is good enough, and there is no alternative. As long as we believe that, we won’t spend any time evaluating alternatives to see if they too are good enough.

It is the self-reinforcing belief that Microsoft produces and supports good enough software, and has the business continuity to CONTINUE to produce and support good enough software, that allows conservative IT managers to get to sleep at night, safe in the knowledge that they have backed a winner.

But what if they are backing a loser? Or, more to the point, what if they begin to BELIEVE they are backing a loser?

They are going to start looking for alternatives. And suddenly Microsoft will BE a loser. And the feedback loop will intensify.

All this to underline how significant it is that even ZDNet, yes, ZDNet, is starting to lose faith in the market dominance of Microsoft.

I think Microsoft could continue to dominate the important, but no longer growing, desktop market for years, even decades to come. However, I don’t think they will.

The analysts have already tracked the decline of Windows relative to tablet and phone operating systems. The CIOs are working on “bring your own device” policies, which will liberate countless desktops from Microsoft monoculture. The trends are not good, and as the trends are publicized more and more, they will only get worse.

Bye bye Microsoft, I wish I could say I’ll miss you.