BenChaliah / Superposition-Transformer

A novel architecture that uses autoencoders to superimpose the hidden representations of a base model and a fine-tuned model within a shared parameter space. B-spline-based blending coefficients and autoencoders adaptively reconstruct the original hidden states according to the input data distribution.
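The description above suggests blending two models' hidden states with an input-dependent coefficient built from B-spline basis functions, then recovering the originals with an autoencoder. Below is a minimal NumPy sketch of that idea under stated assumptions: every name, shape, and the choice of a scalar input feature driving the spline are illustrative, not the repository's actual API, and the weights are random where a real implementation would train them.

```python
import numpy as np

def bspline_basis(t, knots, degree):
    """Evaluate all B-spline basis functions of the given degree at t
    via the Cox-de Boor recursion (clamped knot vector assumed)."""
    m = len(knots) - 1
    # Degree-0 basis: indicator of the knot span containing t.
    B = np.array([1.0 if knots[i] <= t < knots[i + 1] else 0.0
                  for i in range(m)])
    for d in range(1, degree + 1):
        nxt = np.zeros(m - d)
        for i in range(m - d):
            left = right = 0.0
            if knots[i + d] != knots[i]:
                left = (t - knots[i]) / (knots[i + d] - knots[i]) * B[i]
            if knots[i + d + 1] != knots[i + 1]:
                right = ((knots[i + d + 1] - t)
                         / (knots[i + d + 1] - knots[i + 1]) * B[i + 1])
            nxt[i] = left + right
        B = nxt
    return B

rng = np.random.default_rng(0)
d_model = 8

# Hypothetical hidden states from the base and the fine-tuned model.
h_base = rng.normal(size=d_model)
h_ft = rng.normal(size=d_model)

# Clamped quadratic knot vector on [0, 1] -> 6 basis functions.
knots = np.array([0, 0, 0, 0.25, 0.5, 0.75, 1, 1, 1])
degree = 2

# Illustrative scalar summary of the input (mean activation squashed
# into [0, 1)) that drives the blending coefficient.
t = float(np.clip(1 / (1 + np.exp(-h_base.mean())), 0, 1 - 1e-9))
basis = bspline_basis(t, knots, degree)

# Spline control weights (random here; learned in practice).
w = rng.normal(size=basis.shape)
alpha = 1 / (1 + np.exp(-(basis @ w)))  # blending coefficient in (0, 1)

# Superposed hidden state living in the shared parameter space.
h_mix = alpha * h_base + (1 - alpha) * h_ft

# Tiny linear autoencoder that would be trained to recover both
# original states from the superposed one (untrained weights here).
W_enc = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
W_dec = rng.normal(size=(2 * d_model, d_model)) / np.sqrt(d_model)
z = np.tanh(W_enc @ h_mix)
h_base_rec, h_ft_rec = np.split(W_dec @ z, 2)
```

The clamped knot vector makes the basis a partition of unity on [0, 1), so the spline output stays a well-behaved convex combination of its control weights before the sigmoid.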
