I'm not going to go into a lot of explanation about why the following is important because, well, it isn't really that important, except that there are a lot of pedantic twits running around obstructing a major advance in the philosophy of science by, like Theodoric of York, saying "NAHHH..." to the Algorithmic Information Criterion for causal model selection as the most principled information criterion we have in the natural sciences. Their "NAHHH..." takes the form of dismissing the Algorithmic Information Criterion on the grounds that its measure of information, Kolmogorov Complexity, depends on the choice of Turing Machine, which, they claim, is "arbitrary". By this they mean they think they can choose the instruction set of their Turing Machine such that a single instruction outputs the entire set of observations under consideration. Thus their supposed reductio ad absurdum: the Kolmogorov Complexity of any given set of data may be measured as the bit length of a single instruction of their instruction set -- ideally one bit.

No, it isn't, and no, they can't, and they know deep down that they're full of pedantic BS. But, OK, I'll play their pedantic game with them, not that they'll listen since, of course, Theodoric of York wants to get back to his bloodletting cures on his dying patients and has no time for such nonsense as "the scientific method":

It is known that a finite, directed graph of N-input NOR (henceforth NiNOR) gates can perform any computation that a Turing Machine with finite memory can. So...
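To make the universality claim concrete, here is a minimal sketch (my own illustration, not part of the argument above) showing that the basic Boolean connectives, and hence any finite combinational logic, can be built from N-input NOR alone:

```python
# Sketch: N-input NOR is functionally complete. NOT, OR, and AND are
# each expressible as small compositions of NOR, which is the basis of
# the universality claim for finite directed graphs of NiNOR gates.

def nor(*inputs):
    """N-input NOR: true exactly when every input is false."""
    return not any(inputs)

def not_(a):
    return nor(a)              # a 1-input NOR is negation

def or_(a, b):
    return nor(nor(a, b))      # NOR followed by NOT

def and_(a, b):
    return nor(nor(a), nor(b)) # De Morgan: NOT((NOT a) OR (NOT b))

# Verify the derived gates against their truth tables.
for a in (False, True):
    assert not_(a) == (not a)
    for b in (False, True):
        assert or_(a, b) == (a or b)
        assert and_(a, b) == (a and b)
```

With feedback edges (a directed *cyclic* graph) the same gates yield latches and registers, which is why the graph can emulate any Turing Machine with finite memory.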

**TLDR: Write a program, in a chosen instruction set, that simulates the directed graph of NiNOR gates that implements that instruction set, including the memory that holds the program. Then use that instruction set to write a program that, when executed by the gate-level simulation program, outputs the given dataset. The minimum sum of the lengths of these two programs is the NiNOR-Complexity.**

Let me expand just a little bit for those left wondering what I'm talking about:

The choice of Turing machine is not arbitrary any more than the choice of axiomatic basis for arithmetic is arbitrary. The natural sciences don't generally bother with "debunking" Ockham's Razor over debates regarding the choice of arithmetic's axioms. But, *sigh*, OK. If I need to spell this "philosophical nuisance" out in detail, let's descend a level of abstraction from computation to the more primitive concept of a NOR gate -- or, more specifically, a directed cyclic graph of N-input NOR gates. Few would regard this as "arbitrary", since the only competing choice, NAND, is a mere isomorphic rotation in logic and is similarly "universal" -- and _minimal_.
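As a brief aside (my own illustration), the "isomorphic rotation" between NOR and NAND is just De Morgan duality: negate the inputs of one gate and you recover the negation of the other, so nothing hinges on which of the two is taken as primitive:

```python
# Sketch of the NOR/NAND duality: each gate is the De Morgan dual of
# the other, so choosing NOR over NAND as the primitive is not an
# "arbitrary" degree of freedom in any interesting sense.

def nor(a, b):
    return not (a or b)

def nand(a, b):
    return not (a and b)

# De Morgan: NAND(a, b) == NOT(NOR(NOT a, NOT b)), and symmetrically.
for a in (False, True):
    for b in (False, True):
        assert nand(a, b) == (not nor(not a, not b))
        assert nor(a, b) == (not nand(not a, not b))
```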

Choose an emulation program, in a chosen instruction set, that simulates a chosen directed cyclic graph of NiNOR gates that provides that instruction set. This emulation program takes a parameter giving the size of the memory (i.e., the number of times to replicate the memory cells) such that the memory can hold the emulator. We'll also permit an unspecified integer giving the amount of additional memory needed to contain the program to be executed by the emulator. Use that instruction set to choose an executable archive program of a given set of observations encoded as bits. The choices that minimize the combined size of the emulation program and the executable archive program define the NiNOR-Complexity. Now, we still have a free parameter, but is it a choice of Turing machine? No. It's an integer, the size of which grows only as the log2 of the amount of memory required to expand the executable archive.
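The two-part minimization above can be sketched as follows. This is a hedged illustration, not an implementation: the candidate pairs of bit strings are hypothetical stand-ins for real (emulator, archive) program pairs, and since the true minimum is uncomputable, any search over candidates only ever yields an upper bound:

```python
# Sketch of the NiNOR-Complexity minimization described above.
# Each candidate is a pair (emulator_bits, archive_bits): the gate-graph
# emulation program and the executable archive program, both encoded as
# bit strings. Exhibiting candidates can only tighten an UPPER bound on
# the NiNOR-Complexity; the minimum itself is uncomputable.

def ninor_complexity_upper_bound(candidates):
    """Return the smallest combined bit length found among the
    candidate (emulator, archive) pairs."""
    return min(len(emulator) + len(archive)
               for emulator, archive in candidates)

# Toy candidates: bit strings standing in for real programs.
pairs = [
    ("0101" * 8, "11" * 20),  # 32 + 40 = 72 bits
    ("0" * 10, "1" * 50),     # 10 + 50 = 60 bits
]
print(ninor_complexity_upper_bound(pairs))  # → 60
```

Note that the memory-size parameter mentioned above would enter only as an integer argument to the emulator, costing on the order of log2 of the expanded memory size, not as a new degree of freedom in the choice of machine.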

We're still left with the "uncomputability" issue, but that's just one more philosophical nuisance, since what it really means is that we can't prove that a given program is minimal -- i.e., that its length is the NiNOR-Complexity of a given dataset. Progress in the natural sciences is not held back by such concerns any more than it is held back by our inability to prove that a given theory is the ultimate theory of nature.