BenChaliah / Superposition-Transformer
A novel architecture that leverages autoencoders to superimpose the hidden representations of a base model and a fine-tuned model within a shared parameter space. It uses B-spline-based blending coefficients, and the autoencoders adaptively reconstruct the original hidden states based on the input data distribution.
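As a rough illustration of the blending idea described above, the sketch below combines a base-model hidden state and a fine-tuned hidden state with a scalar coefficient built from a B-spline basis (Cox-de Boor recursion). The function names, knot vector, and control points are hypothetical, not taken from the repository; this is a minimal sketch of B-spline blending, not the project's actual implementation.

```python
import numpy as np

def bspline_basis(t, knots, degree, i):
    """i-th B-spline basis function of the given degree, via Cox-de Boor recursion."""
    if degree == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    denom = knots[i + degree] - knots[i]
    if denom > 0:
        left = (t - knots[i]) / denom * bspline_basis(t, knots, degree - 1, i)
    right = 0.0
    denom = knots[i + degree + 1] - knots[i + 1]
    if denom > 0:
        right = (knots[i + degree + 1] - t) / denom * bspline_basis(t, knots, degree - 1, i + 1)
    return left + right

def blend(h_base, h_ft, t, knots, degree, control):
    """Blend two hidden-state vectors with a B-spline-parameterized coefficient.

    alpha(t) is a spline curve over [0, 1) whose shape is set by `control`
    (in the real architecture these would be learned parameters).
    """
    alpha = sum(c * bspline_basis(t, knots, degree, i) for i, c in enumerate(control))
    return alpha * h_base + (1.0 - alpha) * h_ft

# Example: clamped quadratic spline; with all control points at 0.5,
# the partition-of-unity property makes alpha(t) = 0.5 everywhere on [0, 1).
knots = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
control = [0.5, 0.5, 0.5]
mixed = blend(np.ones(4), np.zeros(4), 0.5, knots, degree=2, control=control)
```

Because the blending coefficient is a smooth function of its input `t`, the mixture between the two models can vary continuously rather than switching abruptly, which is the usual motivation for spline-based coefficients.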
80 · Aug 1, 2025 · Updated 7 months ago

Alternatives and similar repositories for Superposition-Transformer

Users interested in Superposition-Transformer are comparing it to the libraries listed below.
