For all those little papers scattered across your desk

In this episode, the intcode computer is born.

Part 1 consisted of implementing a simulator capable of evaluating programs described in intcode. Intcode is a list of comma-separated integers, each of which may be an instruction or an address at any given moment. Later challenges add even more items to that list.

I was initially going to split this up into nice structures and functors (a sort of meta-programming technique where a structure is parameterized by types, values, and structures), but I struggled to do so.

In the end, memory and the interpreter all lived in the same structure. The only operations to support were arithmetic, so it wasn’t an issue at the time.
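As a taste, here is a minimal sketch (in SML, not my actual code) of an arithmetic-only evaluator: opcode 1 adds, opcode 2 multiplies, 99 halts, and every operand is an address into memory.

```sml
(* Sketch of an arithmetic-only intcode step: returns false on halt. *)
fun step (mem : int array, ip : int) : bool =
  case Array.sub (mem, ip) of
      99 => false
    | opcode =>
        let
          val a   = Array.sub (mem, Array.sub (mem, ip + 1))
          val b   = Array.sub (mem, Array.sub (mem, ip + 2))
          val dst = Array.sub (mem, ip + 3)
          val f   = (case opcode of
                         1 => op +
                       | 2 => op *
                       | _ => raise Fail "unknown opcode")
        in
          Array.update (mem, dst, f (a, b));
          true
        end

(* Run to completion, returning the final memory. *)
fun run (mem : int array) : int array =
  let fun go ip = if step (mem, ip) then go (ip + 4) else mem
  in go 0 end

(* run (Array.fromList [1, 0, 0, 0, 99]) leaves 2 at address 0. *)
```

Note how the same cell can be read as an opcode, an address, or a value depending on where the instruction pointer happens to be; that's the whole character of intcode.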

I am proud that I recognized this as a state-machine problem, and found a way to encode every computation for an intcode program in a state-machine, complete with a transition function (more on that in future episodes).

For a good overview of state-machines and other topics, I recommend *Elements of
the Theory of Computation*, by Lewis and Papadimitriou.

I’m still working out all the ramifications, but I believe that the intcode
computer (especially with latter-day modifications) is Turing complete; at the
very least, with a few modifications of today’s architecture, the halting
problem for intcode becomes undecidable. So I believe this means we do not have
a *finite* state-machine, but rather a *possibly-infinite* state-machine.

Briefly, a state-machine is the tuple $(Q, \delta)$, where $Q$ is the arbitrarily-complex set of states and $\delta : Q \to Q$ is the transition function. Note that a state may be a simple object, such as a single symbol, or a complex object, like a tuple of a memory object, an instruction pointer, and a few machine registers.

Given this machine, one feeds an input state $q_0$ through repeated applications of $\delta$. So, for example, the $n$-th iteration of the machine is the value $\delta^n(q_0)$. When we reach a special state that we designate the “halting state,” the machine halts. (We can refine this to include any state in a set of halting states $H$, if we wish—and in fact, later intcode simulators do.)

We can now model an interpreter for this state-machine as the closure of
$\delta$, call it $\delta^*$, which is the repeated application of $\delta$
until we reach a halting state. Alternately, if we make each halting state a
*fix-point* (if $\delta(q) = q$, then $q$ is a fix-point of $\delta$), then
$\delta^*$ is still the closure of $\delta$, though now applied until we reach
the fix-point.
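In SML, with states drawn from an equality type, both ideas are a few lines each (a sketch, with halting states encoded as fix-points):

```sml
(* The n-th iteration of the machine: delta^n applied to q. *)
fun iterate (delta, 0, q) = q
  | iterate (delta, n, q) = iterate (delta, n - 1, delta q)

(* The closure delta*: apply delta until we hit a fix-point. *)
fun closure (delta : ''a -> ''a) (q : ''a) : ''a =
  let val q' = delta q
  in if q' = q then q else closure delta q' end
```

The fix-point formulation is convenient precisely because the interpreter never has to special-case halting: it just keeps stepping until stepping changes nothing.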

I am least proud of the fact that this version used exceptions. Ick.

Part 2 mainly consisted of searching a (relatively small) search-space for values that produced a certain target, so I implemented that outside of all the intcode structures. In fact, I don’t recall making any substantial changes to the code here.
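That search can be sketched like so (hypothetical names throughout; `run` is assumed to be an evaluator of type `int array -> int array` that returns memory after halting, and the two input cells are assumed to live at addresses 1 and 2):

```sml
(* Brute-force the pair (noun, verb) in [0, 99] x [0, 99] whose run
 * leaves `target` at address 0. Returns NONE if no pair works. *)
fun search (program : int list, target : int) : (int * int) option =
  let
    fun attempt (noun, verb) =
      let
        val mem = Array.fromList program  (* fresh memory each trial *)
      in
        Array.update (mem, 1, noun);
        Array.update (mem, 2, verb);
        Array.sub (run mem, 0) = target
      end
    fun loop (noun, verb) =
      if noun > 99 then NONE
      else if verb > 99 then loop (noun + 1, 0)
      else if attempt (noun, verb) then SOME (noun, verb)
      else loop (noun, verb + 1)
  in
    loop (0, 0)
  end
```

The only subtlety is rebuilding memory from the original list on every attempt, since intcode programs freely overwrite themselves.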

This commit was the *slight* refactoring to use more standard naming
conventions. SML programmers typically prefer lower case, then `camelCase`, for
types and identifiers. Structures are the special guys that get `TitleCase`,
while functors require `Fn` at the end and signatures are in `SHOUTYCAPS`.

This very short commit was another refactor: semantically, I tend to associate compilation with source transformation. All we do with an intcode program is load it into a runnable state, so no transformation happens there.

While we’re discussing these kinds of programs (compilers and interpreters), I should mention that I’m probably going to write an intcode assembler by the time this is all said and done. It’s not too difficult. Going the other way is impossible without executing the code, though, as an instruction might later be read and modified by a different instruction. I could even support labels… hm.
