Tuesday, October 27, 2015

Government email deleting: intent matters

I caught Keith Baldrey on the aether-box today (CKNW), and he was being generous in his distribution of benefit of the doubt to the poor, poor government staffers trying to handle their email:

“I’ve talked to government staffers about this, and they are confused on what the rules are, it’s very unclear and unevenly applied over what should be deleted and what should not be.”
— Keith Baldrey, Tuesday, October 27, 15:24 on CKNW

Before we get to remedies, let's review what these poor confused dears are doing. For whatever reason, because they believe the email is not an important record, or a duplicate, or they just can't bear to burden the taxpayers of BC with storing a further 85KB of data, the beleaguered staffers are doing the following:

  1. They select the email in question and hit Delete.
  2. Then they go to their Trash folder and select the option to purge that folder.
  3. Finally they open up a special folder called Recover Deleted, and select the option to purge that folder.

Let's be clear. If the poor confused staffers were just plain vanilla innocently deleting emails that they thought were transitory but were not, they would be stopping at step number one. But they aren't. So there's a very particular intent in play here, and that's to make sure that nobody ever sees what's in these emails ever, ever, ever again. And that intent is not consistent with the (current) cover story about innocently not understanding the rules in play with respect to email management.

Moving on to remedies.

We don't need to train them more (or maybe we do, but not for this). We need to establish a corporate email archive that simply takes a copy of every email, sent and received, and dumps it into a searchable vault. This is widely available technology, used by public companies and investment dealers around the world.
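The mechanism is simple enough to sketch. Here is a minimal, hypothetical illustration of the journaling idea in Python, using only the standard library's `mailbox` module: every message gets copied into an append-only archive that no end user touches, and searches run against the archive, not against anyone's inbox. (The function names are invented for this sketch; real products in this space add indexing, retention policies, and tamper-proofing.)

```python
import mailbox
from email.message import EmailMessage

def archive_copy(msg, archive_path="corporate_archive.mbox"):
    """Journaling hook: append a copy of every sent and received
    email to an append-only archive. In a real deployment this
    runs on the mail server, so the user's [DELETE] key never
    touches the archived copy."""
    archive = mailbox.mbox(archive_path)
    try:
        archive.add(msg)
        archive.flush()
    finally:
        archive.close()

def search_archive(term, archive_path="corporate_archive.mbox"):
    """Naive full-text search over every archived message; a real
    vault would use a proper index instead of a linear scan."""
    archive = mailbox.mbox(archive_path)
    try:
        return [m["Subject"] for m in archive
                if term.lower() in m.get_payload().lower()]
    finally:
        archive.close()
```

An FOI search then becomes one query against the vault, no matter what the staffers did to their own mailboxes.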

Once the archive is in place, staffers can manage their email any way they like. They can keep a pristine, empty mailbox, the way Minister Todd Stone apparently likes to operate. Or they can keep a complete record of all their email, ready to search and aid their work. Or some happy mixture of the two. They'll be more effective public servants, and the public won't need to worry about records going down the memory hole any more.

Let's get it done, OK?

Saturday, October 24, 2015

If I hear the words "triple delete" one more time...

... I'm going to tear my ears off. Also "transitory email". Just bam, going to rip them right off.

Note for those not following the British Columbia political news: While we have known for many years that high-level government staff routinely delete their work email, a smoking gun came to light in the spring. A former staffer told how his superior personally deleted emails that were subject to an FOI request and then memorably said "It's done. Now you don't have to worry anymore." (A line which really should only be delivered over a fresh mound of dirt with a shovel in hand.) The BC FOI Commissioner investigated his allegation and reported back that, yep, it really did happen and that the government basically does it all the time.

The Microsoft Outlook tricks and the contortions of policy around what is "transitory" or not are all beside the point, since:

  1. there is no reason electronic document destruction should be allowed, in any circumstance, ever, because
  2. electronic message archival and retrieval is a solved problem.

The BC Freedom of Information Act, with its careful parsing of "transitory" versus real e-mails, was written in the early 1990s, when there was a tangible, physical cost to retaining duplicative and short-lived records -- they took up space, and cost money to store.

Oh, yes, digital documents cost money to store, but please note: my old CD collection (already a very information-dense medium) takes up a 2-cube box in my garage, but barely dents the storage capacity of a $10 memory stick in MP3 form. My book collection (6 shelves) hardly even registers in digital form. You use more data streaming an episode of Breaking Bad. Things have changed since 1995. And since 2005.

So why are we still having this conversation, and why does the government have such lax rules around message retention? And let me be clear, the government rules are very, very, lax.

In the USA, public companies are under the Sarbanes-Oxley rules and have extremely strict requirements for document retention, with punishments to match:

"Whoever knowingly alters, destroys, mutilates, conceals, covers up, falsifies, or makes a false entry in any record, document, or tangible object with the intent to impede, obstruct, or influence the investigation or proper administration of any matter within the jurisdiction of any department or agency of the United States or any case filed under title 11, or in relation to or contemplation of any such matter or case, shall be fined under this title, imprisoned not more than 20 years, or both."

Similarly, in Canada investment companies must keep complete archives of all messages, in all kinds of media:

Pursuant to National Instrument 31-103 ... firms must retain records of their business activities, financial affairs, client transactions and communication. ... The type of device used to transmit the communication or whether it is a firm issued or personal device is irrelevant. Dealer Members must therefore design systems and programs with compliant record retention and retrieval functionalities for those methods of communication permitted at the firm. For instance, the content posted on social media websites, such as Twitter, Facebook, blogs, chat rooms and all material transmitted through emails, are subject to the above-noted legislative and regulatory requirements.
— IIROC Guidelines for the review, supervision and retention of advertisements, sales literature and correspondence, Section II

Wow! That sounds really hard! I wonder how US public companies and Canadian investment dealers can do this, while the government can't even upgrade its email servers without losing 8 months' worth of archival data:

As it turned out, the entire migration process would take eight months. When the process extended beyond June 2014, MTICS forgot to instruct HPAS to do backups on a monthly basis. This meant that every government mailbox that migrated onto the new system went without a monthly backup until all mailboxes were migrated. Any daily backup that existed was expunged after 31 days. At its peak, some 48,000 government mailboxes were without monthly email backups.
— OIPC Investigation Report F15-03, Page 32

Corporations and investment banks can do this because high-volume enterprise email archiving has been a solved problem for well over a decade. So there are lots of options: proprietary, open source, and even British Columbian!

Yep, one of the top companies in the electronic message archiving space, Global Relay, is actually headquartered in Vancouver! Guys! Wake up! Put a salesperson on the float-plane to Victoria on Monday!

Right now, British Columbia doesn't have an enterprise email archive. It has an email server farm, with infrequent backup files, retained for only 18 months and requiring substantial effort to restore and search. Some of the advantages of an archive are:

  • The archive is separate from the users; they do not individually determine the retention schedule using their [DELETE] key. Retention is applied enterprise-wide on the archive.
  • Archive searches are not done by users, they are done by the people who need access to the archive. In the case of corporate archives, that's usually the legal team. In the case of the government it would be the legal team and the FOI officers.
  • Archive searches can address the whole collection of email in one search. Current government FOI email searches are done computer-by-computer, by line staff who probably have better things to do.
  • The archive is separate from the operational mail delivery and mailbox servers, so upgrades on the operational equipment do not affect the archive.

So, for the next little while, the Commissioner's narrow technical recommendations are fine (even though they make me want to tear my ears off).

But the real long-term technical solution to treating email as a document of record is... start treating it as a document of record! Archive it, permanently, in a searchable form, and don't let the end users set the retention policy. It's not rocket science, it's just computers.

Thursday, October 15, 2015

Keynote at FOSS4G 2015

On my usual biennial schedule, I gave a keynote talk at FOSS4G this year in Seoul, about the opposing pressures that the move to cloud computing is putting on open source. On the one hand, the cloud runs on open source. On the other hand, below the API layer the cloud is pretty much the opposite of open: it's as much a black box as the old Win32 API. And the growth of cloud is paralleled by the shrinkage of infrastructure maintainers in other venues, the kinds of folks who currently use and produce OSS. It's a big change coming down the highway.

Keynote Lecture 5: Where do we go from here? The next 10 years of open source geospatial — Paul Ramsey from FOSS4G on Vimeo.


Friday, October 09, 2015

Krugman FTW

"Sometimes I have the impression that many people in the media consider it uncouth to acknowledge, even to themselves, the fraudulence of much political posturing. The done thing, it seems, is to pretend that we’re having real debates about national security or economics even when it’s both obvious and easy to show that nothing of the kind is actually taking place."
— Paul Krugman

Monday, August 10, 2015

Big Data and Data Science Piss Me Off

Get off my lawn!

I don't talk about this much, but I actually trained in statistics, not in computer science, and I've been getting slowly but progressively weirded out by the whole "big data" / "data science" thing. Because so much of it is bogus, or boys-with-toys or something.

Basically, my objections to the big data thing are the usual: probably your data is not big. It really isn't, and there are some great blog posts all about that.

So that's point number one: most people blabbing on about big data can fit their problem onto a big vertical machine and analyze it to their heart's content in R or something.

Point number two is less frequently touched upon: sure, you have 2 trillion records, but why do you need to look at all of them? The whole point of an education in statistics is to learn how to reason about a population using a random sample. So why are all these alleged "data scientists" firing up massive compute clusters to summarize every single record in their collections?

I'm guessing it's the usual reason: because they can. And because the current meme is that they should. They should stand up a 100 node cluster on AWS and bloody well count all 2 trillion of them. Because: CPUs.

But honestly, if you want to know the age distribution of people buying red socks, draw a sample of a couple hundred thousand records, and you'll have your answer to within a fraction of a percentage point, 19 times out of 20. After all, you're a freaking "data scientist", right?
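The "19 times out of 20" line is just the standard 95% confidence interval for a proportion. A quick back-of-envelope in Python (the function is mine, not from any particular library) shows why a couple hundred thousand records is plenty, regardless of how many trillions sit in the full table:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of size n (z = 1.96 is the "19 times
    out of 20" multiplier). p = 0.5 is the worst case, so this
    is an upper bound for any proportion."""
    return z * math.sqrt(p * (1.0 - p) / n)

# The margin shrinks with sqrt(n), and depends on the sample
# size, not the population size.
for n in (1_000, 10_000, 200_000):
    print(f"n={n:>7}: +/- {margin_of_error(n):.4%}")
```

At n = 200,000 the margin is about ±0.22%: a fraction of a percentage point, exactly as advertised, with no 100-node cluster in sight.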

Wednesday, July 15, 2015

BC IT Outsourcing 2014/15

If what goes up must come down, nobody told BC's IT outsourcers, because they continue to gobble up a larger chunk of the government pie every year.

The BC Public Accounts came out today, and I'm happy to say that the People Who Are Smarter Than You Are managed to book another record year of billings: a $468,549,154 spend, up 8% over last year.

It's not a victory unless you beat someone else, so good news:

  • Overall government revenue, up 5.4%
  • Overall government spending, up 2.4%
  • Health spending, up 2.8%
  • Education spending, up 0%
  • IT services spending up 8%!!!!

Don't be sad, kids and sick people, IT services folks are Adding Value and Finding Synergies in ways that you just can't. In the long run, workshopping the new Management Strategy Realignment Plan is just a better investment than fixing your gimpy hip, or hiring a teaching assistant to help Angry Jimmy focus on his work.

HP Advanced Solutions continues to dominate the category, adding $20M in billings this year alone (How many teachers could that hire? At least 200. Or even more teaching assistants.) In fact, two thirds of the billing growth this year was just HP.

There's also a new kid in the enterprise software vendor list to keep an eye on: Salesforce.com (SFDC) showed up with a wee $463,053 in billings this year. I expect that to increase mightily in coming years. However, the big money in SFDC work will not be earned by SFDC (even after locking up the entire BC government enterprise back-office, Oracle bills less than $10M a year in software maintenance), but by the consultants providing SFDC "implementation services" (Deloitte, CGI, HP). Watch for a SFDC goldrush as the government starts replacing expensive Oracle systems with... expensive SFDC systems in the cloud.

The best part about hiring big public enterprise IT companies like HP, Oracle, Maximus, and CGI to create lots of important Technology Process (and occasionally a bit of Product) for us isn't the soothingly glacial pace of progress or the fantastic billing rates. It's knowing that at least 20% of every public dollar spent goes straight to the bottom line of those companies, ensuring that shareholders and institutional investors survive through another year without undue financial hardship.

Until next year, keep on spending, British Columbia!

Monday, April 27, 2015

More Speech for Money

The BC Liberal government is changing the Elections Act to allow unlimited party and candidate spending within one month of election day and meanwhile, as usual, the media are transfixed by the shiny object in the corner.

The political pundits are making a great deal of noise (see V. Palmer's inside baseball assessment if you care) about an amendment to the Elections Act that says that:

"the chief electoral officer must provide … to a registered political party, in respect of a general election … a list of voters that indicates which voters on the list voted in the general election"

At the same time, they are ignoring that the BC Liberals are fundamentally changing the money dynamic of the fixed election date by eliminating the 60-day "pre-campaign" period.

"Section 198 is amended (a) by repealing subsections (1) and (2) and substituting the following: (1) In respect of a general election, the total value of election expenses incurred by a registered political party during the campaign period must not exceed $4.4 million."

The Elections Act currently divides up the election period before a fixed election into two "halves": the 60 days before the official campaign, and the campaign period itself (about 28 days if I recall correctly). In the first 60 days, candidates can spend a maximum of $70,000 and parties a maximum of $1.1 million. In the campaign period, candidates can spend another $70,000 and parties as much as $4.4 million.

The intent of the "pre-campaign" period is clearly to focus campaigning on the campaign period itself, by limiting the amount of early spending by parties. The "money density" of the pre-campaign period is about $18,000 / day in party spending; in the campaign period, it is almost $160,000 / day.
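For the record, those per-day figures fall directly out of the party spending caps and the period lengths given above:

```python
# Party spending caps under the current (pre-amendment) Elections Act
pre_campaign_cap = 1_100_000   # dollars, over the 60-day pre-campaign period
campaign_cap     = 4_400_000   # dollars, over the ~28-day campaign period

pre_campaign_density = pre_campaign_cap / 60   # ~ $18,333 per day
campaign_density     = campaign_cap / 28       # ~ $157,143 per day

print(f"pre-campaign: ${pre_campaign_density:,.0f}/day")
print(f"campaign:     ${campaign_density:,.0f}/day")
```

Roughly an 8.5-fold jump in permitted spending intensity the moment the writ drops, which is the whole point of front-loading the limit.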

This is all very public-spirited, and contributes to a nice focussed election period. But (BUT!) the BC Liberals currently have more money than they know what to do with, so it is in their interest to be able to focus all that money as close to the event as possible. And rather than simply raising the pre-campaign spending limit, they went one better: they removed it altogether. They can spend unlimited amounts of money as close as 28 days before election day, 21 days before the opening of advance polls.

Let me repeat that: they can spend unlimited amounts of money.

So in British Columbia now, it is legal to both raise unlimited amounts of money from corporations, unions and individuals in any amounts at all (and some individuals and corporations have donated to the BC Liberals, individually, over $100,000 a year), and it is legal to spend unlimited amounts of money, right up to within 28 days of the election day.

See any problems with that?

GIS "Data Models"

Most IT professionals have some expectation, having received a basic education in relational data modelling, that a model for a medium-sized problem might look like this:

Why is it, then, that production GIS data flows so consistently produce models that look like this:

What is wrong with us?!?? I bring up this rant only because I was just told that some users find the PostgreSQL 1600-column limit constraining, since it makes it hard to import the Esri census data, which are "modelled" into tables that are presumably wider than they are long.

Saturday, March 21, 2015

Magical PostGIS

I did a new PostGIS talk for FOSS4G North America 2015, an exploration of some of the tidbits I've learned over the past six months about using PostgreSQL and PostGIS together to make "magic" (any sufficiently advanced technology...)


Friday, March 20, 2015

Making Lines from Points

Somehow I've gotten through 10 years of SQL without ever learning this construction, which I found while proof-reading a colleague's blog post and looked so unlikely that I had to test it before I believed it actually worked. Just goes to show, there's always something new to learn.

Suppose you have a GPS location table:

  • gps_id: integer
  • geom: geometry
  • gps_time: timestamp
  • gps_track_id: integer

You can get a correct set of lines from this collection of points with just this SQL:

SELECT gps_track_id,
  ST_MakeLine(geom ORDER BY gps_time ASC) AS geom
FROM gps_points
GROUP BY gps_track_id

Those of you who already knew about placing ORDER BY within an aggregate function are going "duh", and the rest of you are, like me, going "whaaaaaa?"

Prior to this, I would solve this problem by ordering all the groups in a CTE or sub-query first, and only then pass them to the aggregate make-line function. This, is, so, much, nicer.
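For anyone who wants to see the mechanics outside the database, here is the same sort-within-group logic in plain Python, with made-up sample data; the SQL aggregate is doing exactly this ordering of each group's points by time before concatenating them into a line:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical GPS fixes: (gps_track_id, gps_time, point)
fixes = [
    (2, 10, (0.0, 0.0)),
    (1, 12, (5.0, 5.0)),
    (1, 11, (4.0, 4.0)),
    (2, 13, (1.0, 1.0)),
]

def make_lines(rows):
    """Equivalent of:
         SELECT gps_track_id,
                ST_MakeLine(geom ORDER BY gps_time ASC)
         FROM ... GROUP BY gps_track_id
       Returns one ordered point list per track."""
    rows = sorted(rows, key=itemgetter(0, 1))  # by track, then time
    return {
        track: [pt for _, _, pt in grp]
        for track, grp in groupby(rows, key=itemgetter(0))
    }

print(make_lines(fixes))
# {1: [(4.0, 4.0), (5.0, 5.0)], 2: [(0.0, 0.0), (1.0, 1.0)]}
```

The database version just fuses the sort and the grouping into one pass, which is why it beats the explicit CTE-then-aggregate dance.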

About Me

Victoria, British Columbia, Canada
