Quantum measurement emerges from two separable mechanisms: flux partition into orthogonal channels yields Born rule weights, while redundant environments select which basis gets measured. No additional axioms needed. Just conservation, decoherence, and information theory working together.
The measurement problem haunts quantum mechanics: why do we see definite outcomes with Born rule probabilities in specific bases? Standard approaches either postulate the Born rule or assume preferred bases. This framework derives both from physical mechanisms: flux conservation determines weights, environmental redundancy determines basis. The result is testable, falsifiable, and mechanistic.
By separating these questions, we can solve each with well-understood physics rather than invoking new axioms.
Mechanism 1: Flux partition yields weights
Conservation of probability current, \(\partial_t \rho + \nabla \cdot \mathbf{j} = 0\), means the flux into orthogonal detection channels directly gives the outcome weights: integrating the continuity equation over channel \(k\)'s capture region and over the run time converts the flux through its boundary surface \(S_k\) into accumulated probability,

\[ P_k = \int_0^T \! dt \oint_{S_k} \mathbf{j} \cdot d\mathbf{A} = |\langle k|\psi\rangle|^2 . \]
The Born rule emerges from current conservation, not from a separate postulate.
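As a sanity check on this claim, here is a minimal numerical sketch (our construction, not code from the paper): a superposition of two counter-propagating Gaussian packets evolves freely, and the time-integrated probability current through two detector planes recovers the Born weights \(|c_1|^2\) and \(|c_2|^2\). All parameter values are illustrative.

```python
import numpy as np

# Grid and units (hbar = m = 1; all numbers are illustrative choices).
N, L = 2048, 200.0
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

# Superposition of two counter-propagating Gaussian packets; the amplitudes
# c1, c2 set the Born weights the flux integrals should recover.
c1, c2 = np.sqrt(0.7), np.sqrt(0.3)
k0, w = 5.0, 2.0
packet = lambda sign: np.exp(-x**2 / (4 * w**2) + 1j * sign * k0 * x)
psi = c1 * packet(+1) + c2 * packet(-1)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

def current(psi_t, x0):
    """Probability current j = Im(psi* dpsi/dx) at the grid point nearest x0."""
    i = np.argmin(np.abs(x - x0))
    dpsi = (psi_t[i + 1] - psi_t[i - 1]) / (2 * dx)
    return float(np.imag(np.conj(psi_t[i]) * dpsi))

# Exact free evolution in k-space; accumulate flux through two detector planes.
dt, T = 0.05, 12.0
P_right = P_left = 0.0
step = np.exp(-1j * k**2 / 2 * dt)
psik = np.fft.fft(psi)
for _ in range(int(T / dt)):
    psik *= step
    psi_t = np.fft.ifft(psik)
    P_right += current(psi_t, +30.0) * dt   # rightward flux -> channel 1
    P_left += -current(psi_t, -30.0) * dt   # leftward flux  -> channel 2

print(f"flux weights: {P_right:.3f}, {P_left:.3f}")   # ~0.700, ~0.300
print(f"Born weights: {c1**2:.3f}, {c2**2:.3f}")
```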
Mechanism 2: Environmental redundancy selects the basis
The measurement basis isn't arbitrary. It's selected by environmental dynamics. We define a pointer functional \(P\) that quantifies measurement quality through two ingredients: how redundantly environment fragments record the candidate basis, and how distinguishable those records are.
The basis that maximizes \(P\) becomes the measurement basis. No assumption required.
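The exact form of \(P\) is not reproduced here; as a hedged stand-in, the sketch below scores a candidate basis by how well a single environment fragment distinguishes the two conditional outcomes, a standard quantum-Darwinism diagnostic. For a branching state with perfect \(z\)-basis records, the copied basis wins outright.

```python
import numpy as np

# Hedged stand-in for the pointer functional (the paper's exact form of P is
# not reproduced here): score a candidate system basis by how well a single
# environment fragment distinguishes the two conditional outcomes.
n_frag = 3
alpha, beta = np.sqrt(0.6), np.sqrt(0.4)

# Branching state after perfect z-basis copying into n_frag fragments:
# a|0>|00..0> + b|1>|11..1>, indexed as [system, environment].
dim_env = 2**n_frag
psi = np.zeros((2, dim_env), dtype=complex)
psi[0, 0] = alpha
psi[1, dim_env - 1] = beta

def fragment_state(e):
    """Reduced state of one environment qubit from a conditional env vector."""
    m = e.reshape(2, -1)                  # split off the first fragment
    rho = m @ m.conj().T
    return rho / np.trace(rho)

def pointer_score(b0, b1):
    """Trace distance between one fragment's states conditioned on the
    system being found along b0 vs b1 -- high only if records exist."""
    r0 = fragment_state(b0.conj() @ psi)  # <b0|psi> = conditional env state
    r1 = fragment_state(b1.conj() @ psi)
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(r0 - r1)))

z = (np.array([1.0, 0.0]), np.array([0.0, 1.0]))                     # copied basis
x = (np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2))

print("score(z) =", pointer_score(*z))   # -> 1.0: redundant records exist
print("score(x) =", pointer_score(*x))   # -> 0.0: no records in conjugate basis
```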
We model a spin-1/2 particle in a Stern-Gerlach apparatus with controlled decoherence.
The environment consists of Gaussian collisions that scramble momentum while preserving local position information. Each collision applies a random momentum kick,

\[ \rho \;\to\; \int dq \, \frac{e^{-q^2/2\sigma_q^2}}{\sqrt{2\pi}\,\sigma_q} \, U(q)\, \rho\, U(q)^\dagger, \]

where \(U(q) = \exp(iq X)\) and \(q \sim \mathcal{N}(0, \sigma_q^2)\).
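Because \(U(q)\) commutes with \(X\), the averaged map has a closed form in the position basis: each coherence \(\rho(x, x')\) is multiplied by \(\exp[-\sigma_q^2 (x - x')^2/2]\) per collision, leaving the position distribution exactly invariant. A short sketch (our construction, using the scan's \(\sigma_q\), \(\lambda\), \(T\)) makes the asymmetry visible:

```python
import numpy as np

# Averaged collision map on a position grid (our construction): averaging
# U(q) rho U(q)^dag over q ~ N(0, sigma_q^2) multiplies each coherence
# rho(x, x') by exp(-sigma_q^2 (x - x')^2 / 2).
N, L = 128, 20.0
x = np.linspace(-L/2, L/2, N, endpoint=False)

# Branching state: two localized packets, as after the Stern-Gerlach magnet
# has correlated spin with position.
psi = np.exp(-(x + 4.0)**2) + np.exp(-(x - 4.0)**2)
psi /= np.linalg.norm(psi)
rho0 = np.outer(psi, psi.conj())

sigma_q, lam, T = 1.5, 5.0, 8.0
kick = np.exp(-0.5 * sigma_q**2 * (x[:, None] - x[None, :])**2)
rho = rho0 * kick**int(lam * T)          # expected number of collisions

iL, iR = np.argmin(np.abs(x + 4.0)), np.argmin(np.abs(x - 4.0))
print("position distribution preserved:", np.allclose(np.diag(rho), np.diag(rho0)))
print("surviving branch coherence:", abs(rho[iL, iR]))   # effectively zero
```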
Scanning collision rate \(\lambda\), fragment number \(n_{\text{frag}}\), and evolution time \(T\), we find a sharp transition in which basis the pointer functional selects.
The critical collision rate at this transition obeys a scaling law in \(n_{\text{frag}}\), \(T\), and \(\sigma_q\).
Concrete prediction: at \((\lambda, n_{\text{frag}}, T, \sigma_q) = (5, 30, 8, 1.5)\) with \(F = 0.6\), the functional selects the position basis with distinguishability ratio \(D_{\text{pos}}/D_{\text{mom}} \approx 1.5\).
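The scan itself is easy to caricature. In the toy below (our construction; the crossover location depends entirely on these illustrative numbers), two branches are separated in both position and momentum; momentum kicks leave the position distributions invariant while broadening the momentum distributions, so the discrimination advantage flips from momentum to position at a critical rate.

```python
import numpy as np

# Toy caricature of the (lambda, T) scan: branches at positions +-x0 carry
# momenta -+k0. Kicks broaden each branch's momentum variance by lam*T*sigma_q^2
# but leave its position distribution untouched.
x0, k0, w, sigma_q, T = 1.0, 1.0, 1.0, 1.5, 8.0
grid = np.linspace(-60, 60, 8001)
dg = grid[1] - grid[0]
gauss = lambda mu, s: np.exp(-(grid - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

def trace_dist(p1, p2):
    """Classical trace distance between two probability densities."""
    return 0.5 * np.sum(np.abs(p1 - p2)) * dg

D_pos = trace_dist(gauss(-x0, w), gauss(+x0, w))    # invariant under kicks
for lam in [0.0, 0.05, 0.1, 0.5, 1.0, 5.0]:
    s_p = np.sqrt((1 / (2 * w))**2 + lam * T * sigma_q**2)  # broadened spread
    D_mom = trace_dist(gauss(+k0, s_p), gauss(-k0, s_p))
    winner = "position" if D_pos > D_mom else "momentum"
    print(f"lambda = {lam:4.2f}: D_pos/D_mom = {D_pos/D_mom:5.2f} -> {winner}")
```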
At the crossover point, measurement statistics confirm the basis selection: the position basis provides 50% better discrimination (\(D_{\text{pos}}/D_{\text{mom}} \approx 1.5\)), exactly as the pointer functional predicts.
Decoherence explains why coherence disappears, but not which basis survives or why outcomes have specific probabilities. Our approach adds the missing pieces, giving a three-layer explanation:

1. Decoherence suppresses interference between branches.
2. Environmental redundancy selects the basis in which those branches are defined.
3. Flux partition fixes the Born-rule weight of each branch.
The framework makes concrete, testable predictions: the basis-selection crossover, the critical-rate scaling law, and the distinguishability ratio quantified above. Experimental platforms with engineered, tunable decoherence could test these predictions.
If the pointer functional \(P\) fails to select the basis that maximizes redundancy and distinguishability in controlled experiments, the mechanism is wrong. This makes the theory genuinely falsifiable, unlike interpretational approaches.
The framework provides a concrete algorithm for predicting measurement outcomes, sketched in code below:

1. Enumerate candidate measurement bases.
2. Evaluate the pointer functional \(P\) (redundancy and distinguishability of environmental records) for each candidate.
3. Take the basis that maximizes \(P\) as the measurement basis.
4. Compute each outcome's probability as the conserved flux into its channel.
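An end-to-end sketch of this recipe on the qubit-plus-fragments toy model from earlier (again our construction, not the paper's code): the pointer score selects the basis, and the outcome weights follow from projecting onto the winning channels, which is what the integrated-flux formula reduces to for orthogonal, fully-capturing channels.

```python
import numpy as np

# End-to-end sketch of the recipe: score candidate bases, pick the maximizer,
# then read off channel weights by projection.
n_frag, alpha, beta = 3, np.sqrt(0.6), np.sqrt(0.4)
psi = np.zeros((2, 2**n_frag), dtype=complex)    # index [system, environment]
psi[0, 0], psi[1, -1] = alpha, beta              # a|0>|00..0> + b|1>|11..1>

def pointer_score(b0, b1):
    """Trace distance between one fragment's states conditioned on b0 vs b1."""
    rhos = []
    for b in (b0, b1):
        m = (b.conj() @ psi).reshape(2, -1)      # conditional env state,
        r = m @ m.conj().T                       # reduced to one fragment
        rhos.append(r / np.trace(r))
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rhos[0] - rhos[1])))

def channel_weights(b0, b1):
    """Squared projection onto each channel; the integrated-flux result
    reduces to this for orthogonal, fully-capturing channels."""
    return [float(np.sum(np.abs(b.conj() @ psi)**2)) for b in (b0, b1)]

bases = {"z": (np.array([1.0, 0.0]), np.array([0.0, 1.0])),
         "x": (np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2))}
best = max(bases, key=lambda name: pointer_score(*bases[name]))
print("selected basis:", best)                        # -> z
print("outcome weights:", channel_weights(*bases[best]))  # -> [0.6, 0.4]
```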
This approach transforms measurement from a mysterious collapse into a mechanistic process.
As in the other papers in this series, measurement emerges here from flux conservation, with probability current playing the role that energy flux plays elsewhere. The same mathematical structure that drives gravitational time evolution also governs quantum measurement dynamics.
Bottom line: Quantum measurement isn't mysterious. It's flux partition plus environmental redundancy. Born weights emerge from current conservation, measurement basis from information-theoretic optimization. The result is testable, falsifiable, and mechanistic. No additional axioms required.