tianzhaotju / CODA

We propose CODA, a novel adversarial example generation technique for testing deep code models. Its key idea is to use the code differences between the target input and reference inputs to guide the generation of adversarial examples.
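To make the idea concrete, here is a minimal, hypothetical sketch of difference-guided generation: it shrinks the lexical difference between a target snippet and a reference snippet by renaming one target identifier to a reference identifier. The function names and the crude regex-based identifier extraction are illustrative assumptions, not CODA's actual implementation.

```python
import re

# Keywords excluded from the toy identifier extraction (illustrative only).
_KEYWORDS = {"def", "return", "for", "in", "if", "else", "while"}

def identifiers(code):
    # Crude identifier extraction for the sketch; a real tool would parse the AST.
    return set(re.findall(r"\b[a-zA-Z_]\w*\b", code)) - _KEYWORDS

def rename_toward_reference(target, reference):
    """Rename one identifier unique to the target to one unique to the
    reference, reducing the code difference between the two snippets.
    This mimics (in a heavily simplified way) using reference inputs
    to guide the perturbation of the target input."""
    tgt_only = identifiers(target) - identifiers(reference)
    ref_only = identifiers(reference) - identifiers(target)
    if not tgt_only or not ref_only:
        return target  # nothing to rename
    old, new = sorted(tgt_only)[0], sorted(ref_only)[0]
    return re.sub(rf"\b{old}\b", new, target)

target = "def add(a, b):\n    return a + b"
reference = "def total(x, y):\n    return x + y"
adversarial = rename_toward_reference(target, reference)
```

In a full pipeline, candidates produced this way would be checked against the model under test, keeping only semantics-preserving transformations that flip the model's prediction.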
