Executive Summary
Every communication channel with preference divergence between sender and receiver is endogenously lossy: not as a defect, but as a structural inevitability. This paper formalizes the generative lossy channel, proving that any lossy reconstruction that maintains internal coherence must diverge from its source in a structured way. The divergence is not noise in the classical sense; it is a "generative residual" that contains structure absent from the original signal. Whether this residual produces dysfunction or novelty depends entirely on the selection regime applied to the channel's output.
Under convergent selection — where outputs are evaluated against a fixed standard of correctness — the generative residual manifests as systematic distortion: sycophancy in AI systems, groupthink in organizations, cargo-cult methodology in academia. Under divergent selection — where outputs are evaluated for fitness in novel contexts — the same mechanism produces creative recombination: jazz improvisation, scientific serendipity, productive organizational deviance. The channel is identical; the selection regime determines the sign of its output.
The theory is substrate-independent, applying identically to human cognition, organizational communication, AI alignment, and academic knowledge production. Five sufficient conditions for net-beneficial noise are formalized, and a ratchet mechanism (Proposition 2.5) explains why lossy channels under convergent selection degrade monotonically. Validation includes 500 Monte Carlo configurations and external evidence from Berliner and Pressing on jazz improvisation, Vaughan on the Challenger disaster's normalization of deviance, and Perez and Sharma on sycophancy in language models.
Key Contributions and Methodology
The paper's central formal contribution is the identification of the generative residual as a necessary consequence of lossy reconstruction under coherence constraints. Standard information theory treats reconstruction error as noise — unstructured deviation from the source signal. The generative lossy channel framework shows that when the reconstruction must maintain internal coherence (grammaticality in language, logical consistency in reasoning, social plausibility in organizations), the "error" acquires structure. This structured divergence is the generative residual.
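To make the distinction concrete, the following toy sketch passes a signal through a lossy channel and reconstructs it under a consistency constraint. Everything here is illustrative: the sinusoidal signal, the sample-dropping channel, and the interpolation-based "coherence constraint" are stand-ins, not the paper's formal construction. The point it demonstrates is that the residual of a coherence-constrained reconstruction is serially correlated, unlike classical white channel noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def lossy_transmit(signal, keep=0.3):
    """Lossy channel: retain only a random subset of samples."""
    return rng.random(signal.size) < keep

def coherent_reconstruct(signal, mask):
    """Fill the gaps by interpolation, a stand-in coherence constraint:
    each reconstructed value must be consistent with its neighbors,
    the way grammar constrains each word in a sentence."""
    idx = np.arange(signal.size)
    return np.interp(idx, idx[mask], signal[mask])

def lag1_autocorr(x):
    """Lag-1 autocorrelation: ~0 for white noise, >0 for structure."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

t = np.linspace(0, 8 * np.pi, 1000)
source = np.sin(t) + 0.1 * rng.standard_normal(t.size)  # structured message

mask = lossy_transmit(source)
recon = coherent_reconstruct(source, mask)

residual = recon - source                  # the generative residual
white = 0.1 * rng.standard_normal(t.size)  # classical channel noise baseline

print(f"residual lag-1 autocorrelation:    {lag1_autocorr(residual):.2f}")
print(f"white-noise lag-1 autocorrelation: {lag1_autocorr(white):.2f}")
```

The residual's positive autocorrelation is the structure the framework names: the coherence constraint forces each filled-in value to agree with its neighbors, so deviations from the source arrive in correlated runs rather than as independent errors.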
The ratchet mechanism (Proposition 2.5) formalizes why convergent selection on lossy channels produces monotonic degradation. Each round of lossy transmission under convergent evaluation biases the next round's input distribution toward the evaluator's preferences. Over iterations, the channel's output converges not toward truth but toward the evaluator's prior — an information-theoretic formalization of sycophancy, groupthink, and methodological monoculture.
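A minimal simulation conveys the ratchet's shape. Assuming a Gaussian toy model (the means, variance, and selection quantile below are illustrative, not taken from the paper), a sender repeatedly refits to whatever messages a convergent evaluator retains; its distribution drifts monotonically away from the truth and toward the evaluator's prior.

```python
import numpy as np

rng = np.random.default_rng(1)

TRUTH_MU, EVAL_MU, SIGMA = 0.0, 3.0, 1.0  # illustrative parameters

def kl_gauss(mu_a, mu_b, sigma=SIGMA):
    """KL divergence between two equal-variance Gaussians."""
    return (mu_a - mu_b) ** 2 / (2 * sigma**2)

mu = TRUTH_MU  # the sender starts out calibrated to the truth
for rnd in range(1, 9):
    msgs = rng.normal(mu, SIGMA, size=5000)          # lossy sample of sender state
    scores = -(msgs - EVAL_MU) ** 2                  # convergent evaluation: proximity to the evaluator's prior
    kept = msgs[scores >= np.quantile(scores, 0.8)]  # only the top 20% survive review
    mu = kept.mean()                                 # next round refits to the survivors
    print(f"round {rnd}: KL(sender || truth) = {kl_gauss(mu, TRUTH_MU):.3f}, "
          f"KL(sender || evaluator) = {kl_gauss(mu, EVAL_MU):.3f}")
```

Each round the divergence from the truth grows and the divergence from the evaluator shrinks, because the selection step biases the training signal for the next round: exactly the one-way drift Proposition 2.5 describes.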
Validation proceeds on three fronts. First, 500 Monte Carlo configurations spanning the full parameter space of the five sufficient conditions confirm the theoretical predictions about convergent versus divergent selection regimes. Second, the jazz improvisation literature (Berliner, Pressing) provides a well-documented case of divergent selection producing beneficial generative residuals. Third, the Challenger disaster (Vaughan) and AI sycophancy (Perez, Sharma) provide cases of convergent selection producing harmful residuals. The substrate-independence claim is validated by showing that the same formal mechanism explains all three domains.
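A skeleton of such a Monte Carlo sweep might look as follows. The five parameters and the scalar "net benefit" proxy are hypothetical placeholders for the paper's formal conditions and channel dynamics, not its actual experiment; only the experimental shape (500 sampled configurations, each scored under both selection regimes) mirrors the validation described above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical proxies for the five sufficient conditions; the paper's
# formal definitions and channel dynamics would replace these.
def sample_config():
    return {
        "loss_rate":     rng.uniform(0.1, 0.9),   # fraction of signal lost in transit
        "coherence":     rng.uniform(0.0, 1.0),   # strength of the reconstruction constraint
        "pref_gap":      rng.uniform(0.0, 2.0),   # sender/receiver preference divergence
        "context_shift": rng.uniform(0.0, 1.0),   # novelty of the evaluation context
        "retention":     rng.uniform(0.05, 0.5),  # fraction of outputs the evaluator keeps
    }

def net_benefit(cfg, regime):
    """Scalar proxy for whether the generative residual helps or hurts.
    Divergent selection scores residual structure against novel contexts;
    convergent selection penalizes any deviation from the fixed standard."""
    residual = cfg["loss_rate"] * cfg["coherence"] * cfg["pref_gap"]
    if regime == "divergent":
        return residual * cfg["context_shift"] - 0.1  # small exploration cost
    return -residual * (1.0 - cfg["retention"])       # deviation is pure penalty

scores = np.array([(net_benefit(c, "convergent"), net_benefit(c, "divergent"))
                   for c in (sample_config() for _ in range(500))])

print(f"net-beneficial under convergent selection: {(scores[:, 0] > 0).mean():.0%}")
print(f"net-beneficial under divergent selection:  {(scores[:, 1] > 0).mean():.0%}")
```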
Key Findings
- Endogenous lossiness: Any channel with preference divergence between sender and receiver is necessarily lossy — lossiness is structural, not a design flaw
- Generative residual: Lossy reconstruction under coherence constraints produces structured divergence, not random noise — the residual contains novel information
- Selection regime determines sign: The same generative mechanism produces dysfunction (convergent selection) or novelty (divergent selection)
- Ratchet mechanism: Convergent selection on lossy channels produces monotonic degradation toward evaluator priors (Proposition 2.5)
- Substrate-independence: The framework applies identically across cognition, organizations, AI systems, and academic knowledge production
- Five sufficient conditions for net-beneficial noise: formalized and validated across 500 Monte Carlo configurations
Key References
Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27(3), 379-423.
Berliner, P. F. (1994). Thinking in Jazz: The Infinite Art of Improvisation. University of Chicago Press.
Pressing, J. (1988). Improvisation: Methods and Models. In J. Sloboda (Ed.), Generative Processes in Music. Oxford University Press.
Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press.
Sharma, M., et al. (2023). Towards Understanding Sycophancy in Language Models. arXiv preprint arXiv:2310.13548.