Hello!

I tried to analyze a batch of inputs with the 'smoothgrad' method and neuron_selection_mode = "index" using your package iNNvestigate (version 1.0.9). Surprisingly, only the output for the first input in the batch made any sense, because the gradients for all other inputs were set to 0. I think there's something wrong with your code.
Here is a small example:
I generated a classification dataset with sklearn with 5 input features and 2 classes. On this dataset, I trained a Keras model with 3 dense layers. Then I removed the softmax activation and created analyzers for the methods 'gradient' and 'smoothgrad'. I also set the noise scale of the 'smoothgrad' method to a very small value (noise_scale = 1e-8), which should make its results almost identical to the plainly computed gradients; a short sketch of why follows, and after that the code for my example.
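To illustrate the tiny-noise argument, here is a minimal sketch of my own (plain NumPy, not iNNvestigate internals; the function smoothgrad_1d is made up for illustration): SmoothGrad averages the gradient over noise-perturbed copies of the input, so for a vanishing noise scale the average collapses to the plain gradient.

import numpy as np

# Minimal sketch (my own illustration, not iNNvestigate code): SmoothGrad
# averages the gradient over noisy copies of the input.
def smoothgrad_1d(grad_fn, x, noise_scale, n_samples=16):
    noisy = [x + np.random.normal(0.0, noise_scale, size=x.shape)
             for _ in range(n_samples)]
    return np.mean([grad_fn(xn) for xn in noisy], axis=0)

# Toy check with f(x) = sum(x**2), whose gradient is 2*x: for
# noise_scale = 1e-8 the deviation from the plain gradient is ~0.
grad_fn = lambda x: 2.0 * x
x = np.arange(5, dtype=float)
print(np.max(np.abs(smoothgrad_1d(grad_fn, x, 1e-8) - grad_fn(x))))

Here is the code for my example: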
import numpy as np
np.random.seed(1234)

from sklearn.datasets import make_classification
import keras  # version 2.2.4
import innvestigate  # version 1.0.9
import innvestigate.utils as iutils

n_inputs = 5
n_classes = 2
n_samples = 512

# Generate classification data
data_x, data_y = make_classification(n_samples=n_samples, n_features=n_inputs,
                                     n_classes=n_classes)
data_y = keras.utils.to_categorical(data_y)

# Define keras model and fit it
model = keras.models.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(n_inputs,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])
model.fit(data_x, data_y, epochs=25, batch_size=128)

# Start the analysis with iNNvestigate
model = iutils.keras.graph.model_wo_softmax(model)

# Set the noise level sufficiently low such that almost identical values
# should come out for the methods 'gradient' and 'smoothgrad'
noise_scale = 1e-8

# Define the analyzer for each method
analyzer_smoothgrad = innvestigate.create_analyzer("smoothgrad",
                                                   model,
                                                   noise_scale=noise_scale,
                                                   neuron_selection_mode="index")
analyzer_gradient = innvestigate.create_analyzer("gradient",
                                                 model,
                                                 neuron_selection_mode="index")

# One input for output neuron '0'
result_smoothgrad = analyzer_smoothgrad.analyze(data_x[1:2, :], 0)
result_gradient = analyzer_gradient.analyze(data_x[1:2, :], 0)
print("Mean squared error (one input): {:.4f}".format(
    np.mean((result_gradient - result_smoothgrad) ** 2)))

# Multiple inputs for output neuron '0'
result_smoothgrad = analyzer_smoothgrad.analyze(data_x[1:3, :], 0)
result_gradient = analyzer_gradient.analyze(data_x[1:3, :], 0)
print("Mean squared error (multiple inputs): {:.4f}".format(
    np.mean((result_gradient - result_smoothgrad) ** 2)))
print("Result smoothgrad:")
print(result_smoothgrad)
print("Result gradient:")
print(result_gradient)
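A per-sample loop should work around the problem for now (a sketch on my side, assuming single-sample batches are unaffected):

# Hypothetical workaround (my own sketch, not part of the report above):
# analyze one input at a time and stack the per-sample attributions,
# assuming the bug only appears for batches with more than one sample.
result_smoothgrad_loop = np.concatenate(
    [analyzer_smoothgrad.analyze(data_x[i:i + 1, :], 0) for i in range(1, 3)],
    axis=0,
)
print(result_smoothgrad_loop)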
Running the example shows the problem: the 'smoothgrad' output for the second input makes no sense, because its gradients are all set to 0, while 'gradient' returns nonzero values for both inputs.
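For reference, computing the raw gradients directly with the Keras backend (a sanity-check sketch of my own, not iNNvestigate API) should yield nonzero values for every sample in the batch:

import keras.backend as K

# Sanity check (my own sketch): compute d output[:, 0] / d input directly.
# Summing output neuron 0 over the batch gives each sample its own
# gradient row, independent of the other samples.
grad_fn = K.function(
    [model.input],
    K.gradients(K.sum(model.output[:, 0]), [model.input]),
)
raw_gradients = grad_fn([data_x[1:3, :]])[0]
print(raw_gradients)  # shape (2, 5), expected nonzero for both samples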
I hope I didn't make a mistake, and that this helps you improve this wonderful package.
Best regards,
Niklas