tianzhaotju / CODA

We propose CODA, a novel adversarial example generation technique for testing deep code models. Its key idea is to use the code differences between the target input and reference inputs to guide the generation of adversarial examples.
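As a rough illustration of this idea, the sketch below uses identifier differences between a target program and a reference program to drive semantics-preserving renamings until the model's prediction changes. This is a hypothetical, simplified sketch of the guiding principle only; all function names and the greedy renaming loop are illustrative assumptions, not the repository's actual API or algorithm.

```python
# Illustrative sketch (not CODA's actual implementation): guide adversarial
# transformations of a target program using its differences from a reference
# input that the model treats differently.
import re

PY_KEYWORDS = {"def", "return", "if", "else", "for", "while", "in"}

def identifier_differences(target_code, reference_code):
    """Identifiers that appear in the reference but not in the target."""
    ident = re.compile(r"\b[a-zA-Z_]\w*\b")
    target_ids = set(ident.findall(target_code)) - PY_KEYWORDS
    ref_ids = set(ident.findall(reference_code)) - PY_KEYWORDS
    return ref_ids - target_ids

def rename_identifier(code, old_name, new_name):
    """Semantics-preserving identifier renaming (whole-word substitution)."""
    return re.sub(rf"\b{re.escape(old_name)}\b", new_name, code)

def generate_adversarial(model, target_code, reference_code, renameable_ids):
    """Greedily apply reference-guided renamings until the label flips."""
    candidates = sorted(identifier_differences(target_code, reference_code))
    code = target_code
    for old, new in zip(sorted(renameable_ids), candidates):
        code = rename_identifier(code, old, new)
        if model(code) != model(target_code):
            return code  # adversarial example found
    return None  # no label flip within the candidate budget
```

The renaming here stands in for the broader family of semantics-preserving transformations an approach like this could apply; the point is only that reference inputs narrow the search space of candidate perturbations.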
