In a post-AI landscape, provenance and veracity are paramount. When every entity has access to the same information, provenance and truth become hard to establish, and security becomes crucial. Hacks and forgeries multiply, and the community pays the price while drowning in the ensuing noise. This erodes trust and leads to inefficient markets, greater uncertainty, and weaker unit economics.
A new problem needs a new solution, hence the inception of a new technology to address it.
This new technology, Veracity, solves the provenance problem on an entirely new layer: pre-consensus. The protocol will benefit its adopters for intent beyond economics alone: integrity, provenance, assurance, veracity.
These truths raise the level of trust, and in turn lead to markets where unit economics trend up rather than down.
The underlying values and physics are simple at their core, and they reward adopters who align with these principles. For example, Veracity is the first protocol to introduce pre-consensus security messaging whose value decays over time. This is how the protocol defines provenance.
In physics, there is a clean and ancient distinction between two types of questions. The first: in what order did these events occur? The second: what was true at the moment each event happened? Relativity made the first question famously hard — simultaneity is frame-dependent, ordering is observer-dependent at sufficient velocities. But the second question is, in a deep sense, more fundamental. The state of a system at a given moment is a physical fact. It exists independently of any observer’s ability to sequence it relative to other events.
Blockchain research has spent a decade on the first question. Ordering — sequencing, inclusion, fairness, latency — is the central obsession of pre-consensus design. PBS, SUAVE, preconfirmations, inclusion lists: all of these are sophisticated answers to the question of when and how a message gets committed, relative to other messages.
Veracity asks the second question. Not when was this message ordered, but what was true when this message was born?
The analogy to thermodynamics is precise. A gas in a sealed container has a temperature, a pressure, a volume — a complete state description at every moment in time. If you open a valve, molecules escape and the state changes irreversibly. You cannot recover the original state from the post-valve measurement alone. You need a snapshot taken before the valve opened — a record of the system’s state at t₀, before entropy took over.
In the pre-consensus membrane, data is that gas. An oracle reading, a user intent, a signed attestation — each represents a snapshot of world-state at a specific moment. The moment it enters the pipeline, it begins to interact with other messages, gets reordered, delayed, potentially replayed. By the time it is committed on-chain, the original snapshot has been through a thermodynamic blender. The ordering layer has no memory of what the world looked like when that datum was born.
Veracity is the sealed container. The WAL receipt, written at 12µs signing latency, captures the state snapshot at t₀:

r = BLAKE3(σ ‖ t₀ ‖ m ‖ r_prev)

V(t) = V₀ · e^(−λ(t−t₀)),  λ = (ln 2) / τ₁/₂
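As a concrete sketch of the receipt chain, the following uses Python's standard-library `blake2b` as a stand-in for BLAKE3 (which is not in the standard library); the field encoding, timestamp format, and genesis value are assumptions of this sketch, not the protocol's actual wire format:

```python
import hashlib
import struct

def wal_receipt(sigma: bytes, t0: float, m: bytes, r_prev: bytes) -> bytes:
    """Compute r = H(σ ‖ t₀ ‖ m ‖ r_prev) over a hypothetical encoding.

    blake2b stands in for BLAKE3 here; domain separation and field
    framing in the real protocol may differ.
    """
    h = hashlib.blake2b(digest_size=32)
    h.update(sigma)                      # signature over the message
    h.update(struct.pack(">d", t0))      # birth timestamp, big-endian float64
    h.update(m)                          # the message payload itself
    h.update(r_prev)                     # previous receipt: chains the log
    return h.digest()

# Chain two receipts: each commits to its predecessor, so altering
# any earlier entry invalidates every receipt after it (tamper evidence).
genesis = b"\x00" * 32
r1 = wal_receipt(b"sig-1", 1700000000.0, b"oracle reading", genesis)
r2 = wal_receipt(b"sig-2", 1700000001.0, b"user intent", r1)
```

The chaining is what gives each message "its own forensic history": a receipt cannot be recomputed after the fact without rewriting the entire suffix of the log.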
This is not a metaphor. Exponential decay is the natural description of any quantity whose rate of change is proportional to its current value. For a signed message, the “quantity” is actionable trust — the degree to which a downstream consumer should treat the claim as a current reflection of world-state. Trust decays because the world moves on. The message does not.
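The decay law above can be written directly from the half-life definition; a minimal sketch (parameter names are illustrative, not taken from the protocol):

```python
import math

def trust_value(v0: float, t: float, t0: float, half_life: float) -> float:
    """V(t) = V₀ · exp(−λ(t − t₀)), with λ = ln 2 / τ₁/₂.

    v0: actionable trust at the birth time t0.
    half_life: time for trust to fall to half its initial value.
    """
    lam = math.log(2) / half_life
    return v0 * math.exp(-lam * (t - t0))

# After exactly one half-life, trust is half its initial value;
# after two half-lives, a quarter, and so on.
v = trust_value(v0=1.0, t=10.0, t0=0.0, half_life=10.0)  # ≈ 0.5
```

Choosing λ via a half-life rather than directly makes the operational knob intuitive: operators decide how long a snapshot stays actionable, and the rate constant follows.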
What ordering research cannot do — and has never claimed to do — is recover the pre-valve state. It works with whatever arrives at the builder. Veracity works upstream of that. It seals the container at the moment of truth, stamps it with a tamper-evident clock, and hands the ordering layer a message that carries its own forensic history. Ordering tells you where a message ended up. Veracity tells you what it was worth when it set out.