Research · Per Ardua

Full Mind Transfer: Bandwidth vs Fidelity in Activation-Level Coordination

Text and activations carry fundamentally different information


Executive Summary

Paper XV showed that INLP projections outperform full activations for domain classification, suggesting that lower-bandwidth transmission can sometimes outperform higher-bandwidth transmission. This paper systematically explores the bandwidth-fidelity tradeoff by comparing text-only, activation-only, and combined transmission at multiple PCA dimensionalities.

The central finding is that text and activations carry fundamentally different kinds of information. All text conditions, regardless of verbosity or structure, cluster at RSA similarity ~0.11, while BOS + full activations achieves RSA 0.47, a 4x improvement in geometric fidelity. Yet despite this fourfold geometric advantage, KL divergence remains similar across conditions (6.97-7.35 nats): the activation-level similarity does not translate into more similar output distributions.
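The RSA scores above are representational similarity analysis values in the sense of Kriegeskorte et al. (2008): two sets of activations are compared by correlating their pairwise-dissimilarity matrices. A minimal illustrative sketch (not the paper's code; the data, dimensions, and function names here are invented):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rsa_similarity(acts_a, acts_b):
    """Spearman correlation between the two condensed dissimilarity matrices."""
    rdm_a = pdist(acts_a, metric="correlation")  # pairwise dissimilarities over stimuli
    rdm_b = pdist(acts_b, metric="correlation")
    return spearmanr(rdm_a, rdm_b).correlation

rng = np.random.default_rng(0)
sender = rng.normal(size=(50, 256))                           # 50 prompts x 256-dim activations
receiver = sender + rng.normal(scale=2.0, size=sender.shape)  # noisy "reconstruction"

print(rsa_similarity(sender, sender))    # identical geometry -> 1.0
print(rsa_similarity(sender, receiver))  # degraded geometry -> lower score
```

Because RSA compares geometry rather than raw values, it is invariant to rotations and rescalings of the activation space, which is what makes it a natural fidelity measure for activation-level transfer.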

Perhaps most striking is the domain reversal pattern: text-based coordination preserves legal reasoning best (RSA 0.23 for legal vs 0.08-0.12 for other domains), while activation-based coordination preserves scientific reasoning best (RSA 0.64 for science vs 0.30-0.45 for other domains). This suggests that different domains are encoded at different levels of abstraction — legal reasoning lives more in the symbolic/linguistic layer, while scientific reasoning lives more in the geometric/activation layer. A bandwidth ceiling emerges at approximately 100 PCA dimensions, beyond which additional activation dimensions provide no benefit.
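The bandwidth ceiling can be probed with a PCA sweep: truncate activations to k principal components, reconstruct, and measure how much representational geometry survives. A minimal sketch, assuming synthetic Gaussian activations in place of real model activations (the actual sweep would use the paper's transmission conditions):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
acts = rng.normal(size=(200, 512))            # 200 prompts x 512-dim activations (synthetic)
full_rdm = pdist(acts, metric="correlation")  # geometry of the untruncated activations

results = {}
for k in (10, 50, 100):
    pca = PCA(n_components=k).fit(acts)
    truncated = pca.inverse_transform(pca.transform(acts))  # keep only k dimensions
    # How well does the k-dimensional channel preserve the original geometry?
    results[k] = spearmanr(full_rdm, pdist(truncated, metric="correlation")).correlation
    print(k, round(results[k], 3))
```

Plotting this curve against k is what reveals a ceiling: once the retained components span the task-relevant subspace, further dimensions add bandwidth without adding geometric fidelity.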

Key Findings

  • Fundamental information type distinction: All text conditions cluster at RSA ~0.11; BOS + full activations achieves RSA 0.47 (4x higher geometric fidelity)
  • Geometric fidelity does not transfer: Despite 4x RSA advantage, KL divergence remains similar (6.97-7.35 nats) across all conditions
  • Domain reversal: Text preserves legal best (RSA 0.23), activations preserve science best (RSA 0.64) — different domains live at different abstraction layers
  • Bandwidth ceiling: Returns diminish beyond ~100 PCA dimensions; additional activation bandwidth provides no benefit

Key References

  • McEntire (2026) — INLP Projection Transmission: denoising effect in low-dimensional transfer (Paper XV)
  • McEntire (2026) — Sender Continuation Perplexity: reasoning trajectory alignment (Paper XVII)
  • McEntire (2026) — The Inter-Instance Compression Barrier: uniform lossy channel (Paper XIV)
  • Kriegeskorte et al. (2008) — Representational similarity analysis

Download Full Paper

Access the complete research paper with detailed methodology, empirical evidence, and formal proofs.
