I recently came across this Taxonomy of Tech Debt write-up. Written a few years ago, it presents 3 major axes along which to evaluate technical debt: impact, cost to fix, and contagion. The latter - the idea of tech debt that “infects” its surrounding landscape - is the most interesting to me. I’ve seen this brand of tech debt before, but it was a bit of an “oh yeah!” moment in that I’d never mentally framed it in quite this way. The article also got me thinking about what some of the larger bits of tech debt are - ones that index highly on all three axes (and particularly on contagion).

The first thing that came to mind was the 32-bit integer data type, which is almost certainly a 5 in all three categories. I’ve covered this here multiple times before (in _Graph of the Week - The Number(s) of the Beast_ and _Graph of the Week - Y2K22_). Integer overflows/underflows are a constant source of severe bugs and permeate more or less All of Everything that has anything to do with computers.
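The failure mode is easy to demonstrate. Python’s own integers are arbitrary-precision, so as a sketch we can simulate 32-bit wraparound manually with a small helper (`to_int32` is my name for it, not anything standard):

```python
def to_int32(n):
    """Interpret n modulo 2**32 as a signed 32-bit integer (two's complement)."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

INT32_MAX = 2**31 - 1

print(to_int32(INT32_MAX))      # 2147483647
print(to_int32(INT32_MAX + 1))  # -2147483648: one more tick wraps to the minimum
```

That silent flip from the largest positive value to the most negative one is exactly the class of bug behind Y2K22-style incidents.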

Total Tangent: I sometimes get caught up in wondering what the total number of CPU cycles spent on null checks is. Like…what percentage of the carbon footprint generated by compute is strictly from checking “Is this number zero?”. I’d probably conservatively (and, admittedly, ignorantly - I have no data on this) put it at 20-40% of all CPU cycles, depending on whether you include things like equality checks getting translated into the machine instructions for “subtract these two registers and branch if the result is equal to zero”.

Anyhow. 32-bit integers as “Root Cause” is kind of interesting…but also kind of not. It’s (perhaps ironically) almost too fundamental: the data type is so basic and so pervasive that talking about solving these problems globally is effectively a discussion about changing physical realities, so the “solutions” - broadly speaking - tend toward “just use more bits going forward.”

A piece of societal tech debt that I think is perhaps a bit more interesting: Daylight Savings Time. Handling time with computers is horrendously difficult to get right, and DST isn’t doing anyone any favors. In terms of contagion, DST infects anything having to do with manipulating timestamps - which isn’t “everything”…but it’s an awful lot of things. Imagine you live in Hawaii (which does not observe DST), and you’re scheduling a recurring meeting with someone in New York City (which does observe DST, starting on March 13) and someone else in Dublin, Ireland (which also observes DST…but not starting until March 27). On, say, the Ides of March…what time is the meeting actually supposed to be?
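You can watch the weirdness happen with Python’s standard `zoneinfo` module (3.9+, and assuming IANA tz data is available on the system; I’m also assuming 2022, the year whose DST transitions fall on March 13 and 27). Pin the meeting to New York local time and convert:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A 10:00 meeting, scheduled in New York local time, on the Ides of March.
meeting = datetime(2022, 3, 15, 10, 0, tzinfo=ZoneInfo("America/New_York"))

for tz in ("Pacific/Honolulu", "America/New_York", "Europe/Dublin"):
    print(tz, meeting.astimezone(ZoneInfo(tz)).isoformat())

# Pacific/Honolulu 2022-03-15T04:00:00-10:00
# America/New_York 2022-03-15T10:00:00-04:00
# Europe/Dublin    2022-03-15T14:00:00+00:00
```

New York has already sprung forward (UTC-4), Dublin hasn’t yet (still UTC+0), and Hawaii never moves - so the “same” meeting shifted by an hour for two of the three attendees relative to the week before.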

And so, we have libraries for manipulating dates and times that have a million caveats and provisos and implementation quirks like “Naive [date and time] objects are easy to understand and to work with, at the cost of ignoring some aspects of reality.” So as a programmer I’ve got to worry about whether my datetime object is “woke” enough to know what time it is in Denver right now? Lordy.
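The naive-versus-aware distinction those docs are describing shows up in just a few lines: a naive `datetime` carries no `tzinfo`, and Python refuses to even compare the two kinds:

```python
from datetime import datetime, timezone

naive = datetime(2022, 3, 15, 10, 0)                       # no idea where it is
aware = datetime(2022, 3, 15, 10, 0, tzinfo=timezone.utc)  # knows its zone

print(naive.tzinfo)  # None
try:
    naive < aware
except TypeError as exc:
    print(exc)  # comparing offset-naive and offset-aware datetimes raises
```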

I think this sentence probably sums it up best: “The rules for time adjustment across the world are more political than rational, change frequently, and there is no standard suitable for every application aside from UTC.” Re-read that last bit, and if you’re ever in a position to have to manipulate dates/times with a computer, do yourself a favor: do everything in UTC to the extent possible, converting to a local timestamp only for display to human beings, and only if you must.
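In practice that advice is only a couple of lines - store and compute in UTC, and convert at the display boundary (the `America/Denver` zone here is just an illustrative choice):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Store and do arithmetic in UTC...
created_at = datetime.now(timezone.utc)

# ...and convert only at the edge, when a human needs to read it.
for_display = created_at.astimezone(ZoneInfo("America/Denver"))
```

Both values refer to the same instant (`created_at == for_display` is true); only the rendering differs, which is exactly where you want the DST madness quarantined.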