microsoft / deep-language-networks

We view Large Language Models as stochastic language layers in a network, where the learnable parameters are the natural language prompts at each layer. We stack two such layers, feeding the output of one layer to the next. We call the stacked architecture a Deep Language Network (DLN).
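A minimal sketch of the stacked-layer idea, assuming a generic `call_llm` helper (stubbed here so it runs offline; this is not the repository's actual API): each layer wraps an LLM call whose only learnable parameter is its prompt, and two layers are composed by feeding the first layer's text output into the second.

```python
def call_llm(prompt: str, text: str) -> str:
    """Stand-in for a stochastic LLM call: prompt + input text -> output text."""
    return f"[LLM output for prompt={prompt!r} on input={text!r}]"


class LanguageLayer:
    """One 'language layer': its learnable parameter is a natural-language prompt."""

    def __init__(self, prompt: str):
        self.prompt = prompt  # updated during training, analogous to a weight matrix

    def forward(self, text: str) -> str:
        return call_llm(self.prompt, text)


class DeepLanguageNetwork:
    """Two stacked language layers; layer 1's output becomes layer 2's input."""

    def __init__(self, prompt1: str, prompt2: str):
        self.layer1 = LanguageLayer(prompt1)
        self.layer2 = LanguageLayer(prompt2)

    def forward(self, text: str) -> str:
        hidden = self.layer1.forward(text)  # intermediate natural-language "activation"
        return self.layer2.forward(hidden)


# Hypothetical usage with illustrative prompts
dln = DeepLanguageNetwork(
    prompt1="Summarize the key facts in the input.",
    prompt2="Answer the question using the facts above.",
)
print(dln.forward("Which planet is largest? Jupiter has the greatest mass and radius."))
```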

Alternatives and similar repositories for deep-language-networks:

Users interested in deep-language-networks are comparing it to the libraries listed below.