Category Theory — Brief ☧
"In whom are hid all the treasures of wisdom and knowledge."
— Colossians 2:3 (KJV)
Q: Think about a family tree. You have people (the "things") and
parent-child relationships between them (the "connections"). Now think
about a city map: intersections are the things, roads are the connections.
These look completely different on the surface, but they share a pattern —
a collection of things and the ways they relate to each other. Is there a
single framework that captures this pattern?
A: That is exactly what category theory provides. A category has
three ingredients: objects (the things), morphisms or arrows
(the connections between things), and a rule for composition (if
there is an arrow from A to B and another from B to C, there must be
one from A to C). It is the study of structure and relationship at the
most general level — a kind of "mathematics of patterns."
Think of it like a graph — nodes and edges — but with
the extra requirement that edges compose and every node has a self-loop
(an identity arrow that says "I connect to myself").
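The three ingredients can be sketched in Haskell, where types play the objects and functions the morphisms; the arrow names below are illustrative, echoing the Father/Son/Spirit diagram rather than any real API:

```haskell
-- A category, informally: objects are types, morphisms are functions,
-- composition is (.), and `id` is the identity arrow on every object.
fatherToSon :: String -> String      -- hypothetical arrow A -> B
fatherToSon x = x ++ " sends Son"

sonToSpirit :: String -> String      -- hypothetical arrow B -> C
sonToSpirit x = x ++ " sends Spirit"

-- Composition: arrows A -> B and B -> C yield an arrow A -> C.
fatherToSpirit :: String -> String
fatherToSpirit = sonToSpirit . fatherToSon

-- Identity: the self-loop that every object must carry.
identityArrow :: String -> String
identityArrow = id
```

Composing the two arrows and then applying the result is the same as applying them one after the other, which is exactly the composition rule stated above.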
Objects:      Father ──────→ Son ──────→ Spirit
Morphisms:           "sends"      "sends"
Composition:  Father ─────────────────→ Spirit
                       "sends"
Identity:     Father ──→ Father  (self-relation)

Q: That is beautiful as an abstract idea, but why would a programmer
or engineer care about it?
A: Because a functor — a structure-preserving map from one
category to another — is precisely what a compiler does. It translates
from one language to another while keeping the meaning intact. If you
have ever mapped a function over a list, you have used a functor.
Our pipeline — miniKanren to tensor operations to FPGA gates — is a
chain of functors, each faithfully translating structure from one domain
to the next.
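Mapping over a list is the everyday functor most programmers have already used. A minimal sketch of the structure-preservation claim: translating a composite arrow gives the same result as composing the translated arrows (the example function names are arbitrary):

```haskell
-- The list functor sends a function a -> b to a function [a] -> [b]
-- via fmap, and it preserves composition:
--   fmap (g . f) == fmap g . fmap f
double :: Int -> Int
double = (* 2)

increment :: Int -> Int
increment = (+ 1)

-- Left side: translate the composed arrow in one step.
composedThenMapped :: [Int] -> [Int]
composedThenMapped = fmap (increment . double)

-- Right side: translate each arrow, then compose the translations.
mappedThenComposed :: [Int] -> [Int]
mappedThenComposed = fmap increment . fmap double
```

Both pipelines produce identical output on any list, which is the functor law in action; a compiler pass must satisfy the same kind of law to be meaning-preserving.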
Q: So category theory is really about translation and composition?
A: At its heart, yes. It tells you when two different-looking systems
have the same underlying structure, and it gives you tools to move
between them without losing information. In programming, this shows up
everywhere: type constructors like
Maybe and List are functors; the way you chain operations in iteration patterns
often follows monadic structure; even hash tables
can be viewed through this lens.
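The "chaining operations" pattern is what the Maybe monad captures: each step may fail, and bind (>>=) sequences the steps while short-circuiting on failure. The helper names below are illustrative:

```haskell
-- Two fallible steps, each returning Maybe.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

safeHead :: [Int] -> Maybe Int
safeHead []      = Nothing
safeHead (x : _) = Just x

-- Monadic chaining: take the head of the list, then divide it.
-- If either step fails, the whole computation is Nothing.
headThenDiv :: [Int] -> Int -> Maybe Int
headThenDiv xs d = safeHead xs >>= \x -> safeDiv x d
```

The same sequencing discipline shows up with IO, State, and the nondeterminism behind miniKanren's conde, as the table below notes.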
Core Concepts
| Concept | Meaning | Our Project |
|---|---|---|
| Category | Objects + morphisms + composition | Types + functions |
| Functor | Map between categories preserving structure | Maybe, List, IO in Haskell; our compiler pipeline |
| Natural transformation | Map between functors | maybeToList :: Maybe a -> [a] |
| Adjunction | Optimal pair of functors | Free ⊣ Forgetful |
| Monad | Endofunctor + join + return | IO, State, nondeterminism (conde) |
This table condenses an enormous amount of mathematical machinery into five rows, so let us make sure the key ideas land. A category is the most general way to talk about "things with connections that compose." A functor is a translation between two categories that respects those connections -- it never breaks the structure. A natural transformation is a systematic way to convert one translation into another. Adjunctions and monads build on these foundations to capture patterns like "the optimal pair of translations" and "computations that can be sequenced." If you remember nothing else, remember this: category theory is the science of composition, and composition is the most important idea in all of computer science.
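The natural transformation row deserves one concrete check. maybeToList (from Data.Maybe) converts the Maybe functor into the list functor, and naturality says the conversion commutes with fmap: converting then mapping equals mapping then converting.

```haskell
import Data.Maybe (maybeToList)

-- Naturality of maybeToList:
--   maybeToList . fmap f == fmap f . maybeToList
convertThenMap :: (Int -> Int) -> Maybe Int -> [Int]
convertThenMap f = fmap f . maybeToList

mapThenConvert :: (Int -> Int) -> Maybe Int -> [Int]
mapThenConvert f = maybeToList . fmap f
```

Both orderings agree on Just and on Nothing, which is what makes the conversion "systematic" rather than ad hoc.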
Connection to our system: Our relational tensors are functors from index
categories to Bool -- each tensor maps a tuple of variable indices to a truth value (does this combination satisfy the relation?). The compilation from miniKanren to FPGA gates is a functor
that preserves the compositional structure of search: if two miniKanren operations compose to produce a result, the corresponding FPGA gate operations must compose to produce the same result. This is not a metaphor -- it is a mathematical guarantee. Each level of translation
is guaranteed to preserve the algorithm's meaning, and category theory gives us the language and tools to prove it. When we say "the compiler is a functor," we are saying that every logical relationship in the source language is faithfully reproduced in the target language, and that this faithfulness is preserved under composition.
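A minimal sketch of the tensor view described above, with hypothetical relation names: a relational "tensor" is a Bool-valued function over index tuples, and composing relations (does a middle index connect the two ends?) mirrors morphism composition.

```haskell
-- A relational "tensor" over a small index set: maps an index pair
-- to a truth value (does this combination satisfy the relation?).
type Rel = Int -> Int -> Bool

-- Hypothetical example relation: 0 is a parent of 1, 1 of 2.
parentOf :: Rel
parentOf a b = (a, b) `elem` [(0, 1), (1, 2)]

-- Relation composition: a and c are related iff some middle index b
-- connects them. This is morphism composition in a category of relations.
composeRel :: Rel -> Rel -> Rel
composeRel r s a c = or [r a b && s b c | b <- [0 .. 2]]

grandparentOf :: Rel
grandparentOf = composeRel parentOf parentOf
```

A structure-preserving compilation of such relations must send composed relations to composed implementations, which is the functorial guarantee the paragraph above describes.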
Soli Deo Gloria