Latent-to-Parameter Transfer: High-Bandwidth Logic Injection via Neural-State Conditioned Hypernetworks

Planned submission to NeurIPS 2026, 2025

Keywords: Knowledge Distillation, Hypernetworks, LoRA, Reasoning Models, Parameter-Efficient Transfer

This work proposes Latent-to-Parameter Transfer (LPT), a framework that uses the internal latent representations of a frozen teacher model to generate task-specific low-rank parameter updates for a student model. Unlike text-conditioned adapter generation, LPT conditions directly on the high-dimensional geometry of the teacher's reasoning trace, enabling more faithful transfer of its reasoning behavior.
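To make the idea concrete, here is a minimal sketch of what latent-conditioned low-rank update generation could look like. All names, dimensions, and the pooling choice are illustrative assumptions, not the paper's actual architecture: a small hypernetwork (here, two randomly initialized linear maps) reads a pooled teacher latent `z` and emits LoRA-style factors `A` and `B`, whose product is a rank-`r` update for one student weight matrix.

```python
import numpy as np

# Hypothetical sketch of the LPT idea (not the authors' implementation):
# a hypernetwork maps a pooled teacher hidden state to LoRA factors
# A (d x r) and B (r x d), yielding a low-rank update delta_W = A @ B.
rng = np.random.default_rng(0)
teacher_dim, student_dim, rank = 32, 16, 4

# Hypernetwork weights: one linear map per LoRA factor.
W_A = rng.normal(size=(teacher_dim, student_dim * rank)) * 0.01
W_B = rng.normal(size=(teacher_dim, rank * student_dim)) * 0.01

def generate_delta(teacher_hidden: np.ndarray) -> np.ndarray:
    """teacher_hidden: (seq_len, teacher_dim) latent trace of the frozen teacher."""
    z = teacher_hidden.mean(axis=0)            # pool over tokens (assumed)
    A = (z @ W_A).reshape(student_dim, rank)   # (d, r)
    B = (z @ W_B).reshape(rank, student_dim)   # (r, d)
    return A @ B                               # rank <= r update for W_student

delta_W = generate_delta(rng.normal(size=(10, teacher_dim)))
print(delta_W.shape)  # (16, 16)
```

Because the update is a product of two thin factors, its rank is at most `r` regardless of the student layer's width, which is what keeps the transfer parameter-efficient.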

Code and preprint will be released upon submission.