This is the first entry. Not a manifesto, not a roadmap. Just a marker that this space exists now.

The name — Still Decoding — is the point. Understanding is a process, not a destination. Most of what gets published online pretends otherwise: clean conclusions, confident takes, neat frameworks. That's fine, but it's not what happens inside your head when you're actually learning something.

This blog is an attempt at something different. Expect:

  • Essays that start with a question and don't always land on an answer
  • Technical breakdowns of things I'm figuring out
  • Occasional detours into philosophy, because the interesting questions tend to live at the edges

If you're here, you're probably still decoding too.


Everything below is here because I'm testing the template.

A goat

Here's a goat, because like most developers I secretly dream of becoming a goat farmer.

[Photo: a goat kid in capeweed]

An equation

Boltzmann's entropy formula:

$$S = k_B \ln W$$

Where:

  • $S$ is the entropy of the system
  • $k_B$ is the Boltzmann constant
  • $W$ is the number of microstates — the number of ways the microscopic parts can be arranged while still producing the same macroscopic state

Three letters and a constant, and it bridges the microscopic and the macroscopic. It says that entropy is just a measure of how many ways the underlying parts can be arranged. The more arrangements, the higher the entropy. It's carved on Boltzmann's tombstone.
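To make the bridge concrete, here is a minimal Python sketch (the function name is mine) that evaluates the formula. The interesting property isn't any single value but how entropy scales: because of the logarithm, *doubling* the number of microstates always adds the same fixed amount, $k_B \ln 2$, no matter how large $W$ already is.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W) in joules per kelvin."""
    return K_B * math.log(microstates)

# A single microstate means zero entropy: only one way to arrange things.
assert boltzmann_entropy(1) == 0.0

# Doubling W adds exactly k_B * ln(2), regardless of the starting size.
delta = boltzmann_entropy(2_000_000) - boltzmann_entropy(1_000_000)
assert math.isclose(delta, K_B * math.log(2))
```

That logarithmic scaling is why entropy is additive: combining two independent systems multiplies their microstate counts, which adds their entropies.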

Shannon borrowed this idea some seven decades later to define information entropy — the uncertainty in a message. The formula looks nearly identical, and that's not a coincidence. Whether you're counting molecular configurations or bits in a signal, you're measuring the same thing: how much you don't know. Large language models take this even further — they're trained by minimising cross-entropy, essentially learning to be better predictors of the next token. Shannon showed that compression and prediction are the same thing. Turns out, if you get good enough at prediction, something that looks like understanding starts to emerge.
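The entropy–cross-entropy relationship can be shown in a few lines of Python (a toy sketch; the distributions and function names are mine, not anything from a real training loop). Cross-entropy is the average number of bits you pay when your model predicts distribution $q$ but the data actually follows $p$; it bottoms out at the true entropy exactly when the prediction is right.

```python
import math

def shannon_entropy(p):
    """Entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Average bits to encode draws from p using a code optimised for q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p      = [0.5, 0.25, 0.25]   # the "true" next-token distribution
q_good = [0.5, 0.25, 0.25]   # a perfect predictor
q_bad  = [1/3, 1/3, 1/3]     # a predictor that has learned nothing

# A perfect predictor pays exactly the entropy of the data...
assert math.isclose(cross_entropy(p, q_good), shannon_entropy(p))

# ...and any mismatch costs extra bits (the gap is the KL divergence).
assert cross_entropy(p, q_bad) > shannon_entropy(p)
```

Minimising cross-entropy during training is just closing that gap: squeezing the model's $q$ toward the data's $p$, one token at a time.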