
OpenMorph

A memory-over-energy AI/ML platform inspired by biological neural networks. Systems that learn from every interaction instead of being frozen after one training cycle. Digital organisms, not static models.

Context

Why it exists

The dominant AI story is scale: larger models, more data, more compute, more datacentres. Progress measured by training-run cost. The bill is paid in watts, water, attention and trust. OpenMorph is an attempt at a different answer: biological architectures that keep learning from every interaction, encode information in structure rather than in brute-force parameter counts, and stay small enough to run close to where they are used.

Shipping

What it ships today

  • A substrate for "digital organisms" — computational entities with persistent memory, multi-modal representations, and evolutionary lineage.
  • Storage layer combining vector search (LanceDB), graph structure (KuzuDB / DuckDB), and filesystem-native persistence.
  • FastAPI + gRPC surface for integration, with streaming evaluation and organism-level versioning.
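The organism abstraction in the list above can be sketched in plain Python. This is a minimal illustration, assuming an append-only memory log and a parent-pointer lineage chain; the class, field, and method names are hypothetical, not OpenMorph's actual API:

```python
from dataclasses import dataclass, replace
from typing import Optional
from uuid import uuid4

@dataclass(frozen=True)
class Organism:
    """Illustrative organism record: persistent memory plus an
    evolutionary lineage chain (names are assumptions, not the
    platform's real schema)."""
    organism_id: str
    generation: int = 0
    parent_id: Optional[str] = None
    memory: tuple = ()  # persistent, append-only experience log

    def learn(self, observation):
        # Every interaction extends memory, rather than retraining
        # from scratch; the record itself stays immutable.
        return replace(self, memory=self.memory + (observation,))

    def fork(self):
        # A fork starts a new lineage node that points back at its
        # parent, so ancestry stays queryable across versions.
        return Organism(
            organism_id=str(uuid4()),
            generation=self.generation + 1,
            parent_id=self.organism_id,
            memory=self.memory,
        )
```

Used this way, organism-level versioning falls out of the lineage chain: each fork carries its parent's memory forward while recording where it came from.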

Roadmap

What's next

Formalising the containment and lineage protocols (what happens when an organism evolves, when it interacts with external systems, when it is copied or forked). Benchmark work on sparse-activation inference.
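The sparse-activation idea behind that benchmark work can be shown with a toy forward pass: only units whose gate clears a threshold participate, so inference cost tracks the number of active units rather than total model size. The function below is an illustrative sketch under that assumption, not OpenMorph's inference code:

```python
def sparse_dot(x, weights, gates, threshold=0.5):
    """Weighted sum over active units only (illustrative sketch).

    Units whose gate falls below the threshold are skipped entirely,
    so the work done scales with the active count, not len(weights).
    """
    active = [i for i, g in enumerate(gates) if g >= threshold]
    y = sum(weights[i] * x[i] for i in active)
    return y, len(active)

y, n_active = sparse_dot(
    x=[1.0, 2.0, 3.0, 4.0],
    weights=[0.5, -1.0, 0.25, 2.0],
    gates=[0.9, 0.1, 0.8, 0.2],
)
# Only units 0 and 2 fire: 0.5*1.0 + 0.25*3.0 = 1.25
```

Benchmarking then amounts to measuring how quality and cost move as the threshold varies.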

Alignment

Principles this carries

— 02

Inside planetary limits

Earth has boundaries. Compute is not free. Attention is not free. Trust is not free. We design to work within limits, and we plan for regeneration rather than extraction.

— 03

Commons over commodification

Knowledge that shapes how life unfolds should belong to life — not to whichever entity reached a market first. We default to the commons, and we license like we mean it.

Full set at /why.

Working on something adjacent?

If OpenMorph resonates — as inspiration, as a collaboration, or as a problem you are solving in parallel — we would like to hear from you.

Start a conversation →