
Is it possible for the smoothed classifier to completely abstain on test set? #8

@kirk86

@jmcohen Hi, thanks for releasing the code.
If you don't mind me asking, I'm trying to understand whether it's possible for a smoothed classifier trained with randomized smoothing to completely abstain on the CIFAR-10 test set corrupted with PGD l-infinity attacks.

I've trained a smoothed classifier with noise=0.56, and at test time I run PGD with epsilon=0.1 under the l-infinity norm to evaluate its robustness.

e.g. running one epoch on the CIFAR-10 test set:

    # one epoch over the CIFAR-10 test set
    for batch in minibatches:
        # produce adversarial samples for this batch: PGD, l-infinity, epsilon=0.1
        adversarial_samples = pgd_attack(batch)
        for x in adversarial_samples:
            # randomized-smoothing prediction
            predicted_label = smooth_classifier.predict(x, n=10, alpha=0.001, batch_size=128)
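For reference, my understanding of the abstain rule inside `predict` (following the PREDICT procedure from the Cohen et al. paper; the helper names below are mine, not the repo's API) is roughly: sample `n` noisy copies, take the top two class counts, and abstain unless a two-sided binomial test rejects at level `alpha`:

```python
from math import comb

ABSTAIN = -1

def two_sided_binomial_p(k, n):
    """Two-sided p-value for k successes in n fair-coin (p = 0.5) trials.
    The p = 0.5 distribution is symmetric, so double the upper tail."""
    upper = sum(comb(n, i) for i in range(k, n + 1)) * 0.5 ** n
    return min(1.0, 2.0 * upper)

def predict_from_counts(counts, alpha):
    """Hypothesis-test step of randomized-smoothing prediction:
    abstain unless the top class count is significantly larger than
    the runner-up at significance level alpha (illustrative sketch)."""
    ranked = sorted(range(len(counts)), key=lambda c: counts[c], reverse=True)
    n_a, n_b = counts[ranked[0]], counts[ranked[1]]
    if two_sided_binomial_p(n_a, n_a + n_b) > alpha:
        return ABSTAIN
    return ranked[0]

# With n=10 samples and alpha=0.001, even a unanimous 10-0 vote abstains,
# since the two-sided p-value for 10/10 is 2 * 0.5**10 ≈ 0.00195 > 0.001:
print(predict_from_counts([10, 0], alpha=0.001))  # -1 (abstain)
print(predict_from_counts([10, 0], alpha=0.01))   # 0
```

So if my reading is right, `n=10` with `alpha=0.001` can never satisfy the test, which might explain what I'm seeing.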

Am I missing something, or is it completely normal in this case for the smoothed classifier to abstain from prediction on the whole CIFAR-10 test set?

Thanks!
