Open Vancouver

Via Geoff Zeiss, those granola-sucking hippies on the City of Vancouver council have decided to promote open data, open standards, and … open source?!?

I always get a little worried when politicians start actively mucking around in the operational details of government, but to be fair it is hard to draw a black-and-white line between where policy stops and implementation begins. Is mandating open source a policy, or is it telling city workers how to do their jobs? Council could also pass a resolution not allowing the “q” key to be pressed on Fridays.

In this great debate, I think my position is this: mandating open data (basically an enhanced form of Freedom of Information) and open standards (without which Freedom of Information is moot) make sense as policy matters. Mandating open source crosses the line, dealing more with “means” than with “ends”. City managers will eventually figure out on their own that the value proposition of considering open source is high. Council might be (hell, they are) right on this issue, but defining what tools are to be used to achieve operational goals is a slippery slope, and there are lots of places where council will be wrong in the future if they go down this path.

For a great example of why you don’t want legislators telling the executive precisely what to do, see congressional earmarks.

iPhone w/ Magnetometer

Magnetometer + GPS + accelerometer == device that knows exactly where it is and where it is pointed. So, augmented reality, here we come.
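As a toy illustration of what the magnetometer adds, here is a minimal compass-heading calculation. This is a sketch under stated assumptions, not the iPhone API: I'm assuming the device is held flat, with the x axis pointing out the top of the device and the y axis to its left; real code would use the accelerometer to tilt-compensate, and would correct magnetic declination to get true north.

```python
import math

def heading_degrees(mx, my):
    """Compass heading, clockwise from magnetic north, for a device
    held flat. Assumed axes: x out the top of the device, y to its
    left; mx/my are the horizontal magnetic field components in the
    device frame (any consistent units)."""
    return math.degrees(math.atan2(my, mx)) % 360.0

# Facing north, the field lies along +x; facing east, north is to
# the device's left, so the field appears along +y.
print(heading_degrees(20.0, 0.0))   # 0.0 (north)
print(heading_degrees(0.0, 20.0))   # 90.0 (east)
```

GPS supplies the position, the accelerometer supplies the tilt, and this supplies the bearing — which is exactly the pose information an augmented-reality view needs.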

Nice ship. Be a real shame if something happened to it.

If terrorists can use Google Earth to bring down Western civilization, I’m sure pirates can find something useful to do with this.

Update: Title Reference

Magnum Opus

I recently (well, two months ago) wrote down a great many thoughts about the architecture for web mapping we are pushing at OpenGeo, and it’s up on the website now.

http://opengeo.org/publications/opengeo-architecture/

Architecture of Evil

Update: I think the magnitude of the evil can only be appreciated if you see the JSP (yep, that’s all of it, that’s my “middleware”):

<%@ taglib uri="http://java.sun.com/jsp/jstl/sql" prefix="sql" %>
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<%@ page contentType="text/x-json" %>

<sql:query var="rs" dataSource="jdbc/postgisdb">
${param.sql}
</sql:query>

{"type":"FeatureCollection",
 "features":[
<c:forEach var="row" items="${rs.rows}" varStatus="rowStatus">
 {"type":"Feature",
  "geometry":<c:out value="${row.st_asgeojson}" escapeXml="false" />,
  "properties":{
  <c:set var="firstProperty" value="true" />
  <c:forEach var="column" items="${row}">
   <c:if test="${column.key != 'st_asgeojson'}">
    <c:if test="${!firstProperty}">,</c:if>
    "<c:out value="${column.key}" escapeXml="false" />":
    "<c:out value="${column.value}" escapeXml="false" />"
    <c:set var="firstProperty" value="false" />
   </c:if>
  </c:forEach>
 }}<c:if test="${!rowStatus.last}">,</c:if>
</c:forEach>
]}

Update 2: Yes, I am being a bit sarcastic. Being able to compress the layer between the Javascript and the database into something this narrow is diabolical, and only possible because there is so much smarts in OpenLayers. I, for one, welcome our new hipster Javascript overlords.

Update 3: The “evil” is passing SQL unmediated from your browser directly into your database. It’s fun in a workshop (which is what I wrote this abomination for) but it’s not to be let out of the lab, lest global pandemic ensue.
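To make the danger concrete: whatever lands in the `sql` request parameter is interpolated straight into `<sql:query>` and executed verbatim. A sketch of the request side in Python — the host, path, and table names here are hypothetical, just stand-ins for wherever you deployed the JSP:

```python
from urllib.parse import urlencode

# Hypothetical endpoint where the JSP above is deployed.
BASE = "http://localhost:8080/geojson.jsp"

def query_url(sql):
    # The JSP runs ${param.sql} as-is, so the URL carries a
    # complete SQL statement to the database.
    return BASE + "?" + urlencode({"sql": sql})

# The intended use: ask for rows as GeoJSON features...
good = query_url("SELECT name, ST_AsGeoJSON(geom) AS st_asgeojson FROM parks")

# ...but nothing stops a caller from sending this instead.
evil = query_url("DROP TABLE parks")

print(good)
print(evil)
```

Both URLs are equally welcome at the server, which is the whole pandemic in one line.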