
Commit 386a2f9

Added learning rate scaling. Thanks @marcofraccaro
1 parent 199733b commit 386a2f9

File tree: 2 files changed (+2 −1 lines)


NN/nnsetup.m

Lines changed: 1 addition & 0 deletions
@@ -9,6 +9,7 @@
     nn.activation_function = 'tanh_opt';   %  Activation functions of hidden layers: 'sigm' (sigmoid) or 'tanh_opt' (optimal tanh).
     nn.learningRate = 2;                   %  learning rate Note: typically needs to be lower when using 'sigm' activation function and non-normalized inputs.
     nn.momentum = 0.5;                     %  Momentum
+    nn.scaling_learningRate = 1;           %  Scaling factor for the learning rate (each epoch)
     nn.weightPenaltyL2 = 0;                %  L2 regularization
     nn.nonSparsityPenalty = 0;             %  Non sparsity penalty
     nn.sparsityTarget = 0.05;              %  Sparsity target
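
The new field defaults to 1, which leaves existing scripts unchanged. Because nntrain.m (hunk below) multiplies the learning rate by this factor once per epoch, a value below 1 gives an exponential decay schedule. A minimal sketch of that schedule in plain MATLAB; the 0.95 factor and the epoch count are illustrative, not defaults:

% Sketch (not part of the commit) of the per-epoch learning rate
% produced by a constant multiplicative scaling factor.
lr0       = 2;        % default nn.learningRate from nnsetup.m
scaling   = 0.95;     % illustrative nn.scaling_learningRate < 1 for decay
numepochs = 10;

lr_per_epoch = lr0 * scaling .^ (0 : numepochs - 1);  % rate used in epochs 1..numepochs
disp(lr_per_epoch)    % 2.0000  1.9000  1.8050  ...  (shrinks by 5% each epoch)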

NN/nntrain.m

Lines changed: 1 addition & 1 deletion
@@ -69,7 +69,7 @@
     end

     disp(['epoch ' num2str(i) '/' num2str(opts.numepochs) '. Took ' num2str(t) ' seconds' '. Mean squared error on training set is ' num2str(mean(L((n-numbatches):(n-1))))]);
-
+    nn.learningRate = nn.learningRate * nn.scaling_learningRate;
 end
end
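
A hedged usage sketch: the scaling factor is just a field on the nn struct, so it can be set between nnsetup and nntrain. The layer sizes, the 0.99 factor, and the train_x / train_y variables below are placeholders for illustration, not part of this commit:

nn = nnsetup([784 100 10]);            % architecture sizes are illustrative
nn.scaling_learningRate = 0.99;        % hypothetical choice: decay the rate by 1% each epoch
opts.numepochs = 20;                   % opts fields as used elsewhere in nntrain.m
opts.batchsize = 100;
[nn, L] = nntrain(nn, train_x, train_y, opts);  % train_x / train_y assumed to exist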
