IBM / responsible-prompting-api
Responsible Prompting is an LLM-agnostic tool that dynamically supports users in crafting prompts that embed responsible intentions, helping them avoid harmful or adversarial prompts.
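The general idea behind such a tool is to match an in-progress prompt against a catalog of themes and, when a theme is detected, recommend a sentence that steers the prompt toward responsible use. The following is a toy illustration of that pattern using simple token overlap (Jaccard similarity); it is not the project's actual implementation, and all theme names, keywords, and suggestion strings are invented for the example:

```python
# Toy sketch of recommendation-by-similarity: suggest a "responsible"
# sentence to append when the prompt overlaps with a flagged theme.
# NOT the project's actual code; themes, keywords, and thresholds are invented.

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two token sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical catalog mapping themes to trigger keywords and suggestions.
KEYWORDS = {
    "hiring": {"hire", "candidate", "resume", "applicant"},
    "medical": {"diagnosis", "symptom", "treatment", "patient"},
}
SUGGESTIONS = {
    "hiring": "Ensure the selection criteria are free of demographic bias.",
    "medical": "Note that the output is not a substitute for professional advice.",
}

def recommend(prompt: str, threshold: float = 0.05) -> list[str]:
    """Return suggestions for every theme whose keywords overlap the prompt."""
    tokens = set(prompt.lower().split())
    return [SUGGESTIONS[theme]
            for theme, kws in KEYWORDS.items()
            if jaccard(tokens, kws) >= threshold]

print(recommend("Write an email rejecting a candidate after the resume screen"))
# → ['Ensure the selection criteria are free of demographic bias.']
```

A production system would typically replace the token-overlap step with sentence embeddings and a curated catalog of values, but the retrieve-and-recommend loop stays the same.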
★ 45 · Jan 26, 2026 · Updated last month

Alternatives and similar repositories for responsible-prompting-api

Users interested in responsible-prompting-api are comparing it to the libraries listed below.
