Software Versioning & The Digital Multiverse

Brussels, Belgium

We're now going to delve into what I think is a natural extension of a previous post on sources of truth. The conversation so far has centred on general problems of decentralisation and reconciliation, but that's far from the end of the story! When I wrote about the supply chain, I mentioned that digital production processes, machinery and their associated supply chains are dynamic and evolve over time, at least partly in response to an ongoing process of discovering requirements.

The product of these evolutions is the release of new software versions that generally attempt to maintain the integrity of all previous promises in addition to making new ones. This is a broadly applicable model of a process that takes place anywhere software is produced: programming languages (as mentioned in mining for primitives), frameworks, and third-party solutions - whether solutions included in the user-facing software itself or tools that application developers use to build software of their own.

What this can amount to in a practical sense for software developers is an environment where the ground is continuously shifting beneath your feet. Thanks to automatic updates, the tools you used yesterday may not be the tools available to you today, unless you are able to engineer an environment where that cannot be the case. Any update to any system is potentially a path to an altering of the deal, in the Darth Vader sense.
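Engineering an environment where the ground cannot shift typically means pinning versions. As a minimal sketch, in a Python project's requirements file (the package name and version here are purely illustrative):

```
# Unpinned: whatever version the installer resolves today - the deal may be
# altered by tomorrow's release.
requests

# Pinned to an exact version: the same tool yesterday, today and tomorrow.
requests==2.31.0
```

Lockfiles take this further by pinning the entire transitive dependency tree, not just the packages you name directly.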

When I wrote about ownership and access rights I mentioned the concept of handshake agreements, and how interactions between software applications and software platforms such as iOS involve an implicit expectation of a shared protocol. Given that iOS is a software project managed by a large independent stakeholder, it may be changed on a timeline that is independent of any project that interacts with it. As I wrote there, these updates potentially introduce an altering of the deal, which is generally externalised toward application developers.

External parties aren't the only ones who alter the deal. When we release updates to our own software, sometimes we are altering the deal with our former selves. Revisiting the example from the last post about offline-first apps: reconciliation of states (or truths) generally relies upon an agreed-upon protocol for the shape of the data. A high degree of similarity, as assessed by a human, is not sufficient; in many scenarios digital systems demand exactness. There is no guarantee of how many versions apart a legacy offline state may be from the state-of-the-art online state, either. Part of maintaining robust software is managing the integrity of data that may have been created at the time of the very first public launch, and ensuring it is transformable into something recognised by the state of the art. Anything less risks exposing users to the "I put a sandwich into the VCR" scenario I raised in ownership and access rights.
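One common way to keep launch-day data transformable into the current shape is a chain of stepwise migrations, each upgrading a record by exactly one version. A minimal sketch, assuming each stored record carries a "version" field (the field names and version changes here are illustrative, not from the post):

```python
def migrate_v1_to_v2(record: dict) -> dict:
    # v2 split a single "name" field into first and last parts.
    first, _, last = record.pop("name").partition(" ")
    record.update(version=2, first_name=first, last_name=last)
    return record

def migrate_v2_to_v3(record: dict) -> dict:
    # v3 made "last_name" optional with an explicit empty default.
    record.setdefault("last_name", "")
    record["version"] = 3
    return record

MIGRATIONS = {1: migrate_v1_to_v2, 2: migrate_v2_to_v3}
CURRENT_VERSION = 3

def upgrade(record: dict) -> dict:
    """Walk a record of any legacy version up to the current shape."""
    while record["version"] < CURRENT_VERSION:
        record = MIGRATIONS[record["version"]](record)
    return record

# A record written at the very first public launch still upgrades cleanly.
legacy = {"version": 1, "name": "Ada Lovelace"}
print(upgrade(legacy))
# {'version': 3, 'first_name': 'Ada', 'last_name': 'Lovelace'}
```

Because no step is ever deleted, a state that has been offline for arbitrarily many versions still has a path to the state of the art.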

To complicate this further, many contemporary software projects have a division of labour, such as between a mobile app and the cloud. To maintain compatibility across many versions while still facilitating the addition of new features, what often happens in practice is that responses from the cloud vary depending on the version of the mobile client interacting with them. The result is that, in the mobile space, both the mobile apps and the cloud computing capability that supports them should account for a range of possible partners to perform handshake agreements with. This is just to service "what is generally expected" - before we consider that we may have features in general preview and/or may be performing additional experiments on segments of users.
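Varying cloud responses by client version can be sketched as follows, assuming the app sends its version with each request. The field names and the version cut-off are hypothetical, invented purely for illustration:

```python
def profile_response(user: dict, client_version: int) -> dict:
    """Shape the payload to match what each client generation expects."""
    payload = {"id": user["id"], "display_name": user["name"]}
    if client_version >= 5:
        # Newer clients understand the richer avatar object...
        payload["avatar"] = {"url": user["avatar_url"], "animated": False}
    else:
        # ...while older ones still expect a bare URL string.
        payload["avatar_url"] = user["avatar_url"]
    return payload

user = {"id": 7, "name": "Sam", "avatar_url": "https://example.com/a.png"}
old_shape = profile_response(user, client_version=4)
new_shape = profile_response(user, client_version=5)
```

The cost of this approach is that every such branch is a handshake the cloud must honour for as long as that client generation remains in the wild.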

In practice there are other considerations to the digital multiverse, stories for another time. Until then.