Commit 44e1ca8

Fix MNISTAutoencoder: switch output layer to LEAKYRELU

The SOFTMAX default does not work with autoencoders.

Signed-off-by: Marc Strapetz <[email protected]>

Parent: 9e971fd

1 file changed (+1, −0 lines)

dl4j-examples/src/main/java/org/deeplearning4j/examples/quickstart/modeling/feedforward/unsupervised/MNISTAutoencoder.java

@@ -74,6 +74,7 @@ public static void main(String[] args) throws Exception {
         .layer(new DenseLayer.Builder().nIn(10).nOut(250)
             .build())
         .layer(new OutputLayer.Builder().nIn(250).nOut(784)
+            .activation(Activation.LEAKYRELU)
             .lossFunction(LossFunctions.LossFunction.MSE)
             .build())
         .build();
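The reason behind the fix can be illustrated without DL4J. The sketch below is plain Java (the `softmax` and `leakyRelu` functions here are standalone illustrations, not DL4J's implementations): softmax forces its outputs to sum to 1, so a 784-unit softmax output layer can never reproduce arbitrary per-pixel intensities under MSE loss, while leaky ReLU passes positive values through unchanged. The alpha of 0.01 used here is an assumption matching DL4J's documented default for LEAKYRELU.

```java
public class ActivationSketch {

    // Numerically stable softmax: outputs are positive and sum to 1,
    // regardless of the scale of the inputs.
    public static double[] softmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0;
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < x.length; i++) out[i] /= sum;
        return out;
    }

    // Leaky ReLU with alpha = 0.01 (assumed to match DL4J's default):
    // positive pre-activations pass through unchanged, so the layer can
    // represent arbitrary reconstruction targets.
    public static double leakyRelu(double x) {
        return x >= 0 ? x : 0.01 * x;
    }

    public static void main(String[] args) {
        // Four equal pre-activations: softmax outputs always sum to 1,
        // so bright images (many large pixel values) are unreachable.
        double sum = 0;
        for (double v : softmax(new double[]{2.0, 2.0, 2.0, 2.0})) sum += v;
        System.out.println(sum);            // ~1.0, whatever the input scale

        System.out.println(leakyRelu(2.0)); // intensity preserved
    }
}
```

In short: softmax is a sensible default for a classifier's output layer, but an autoencoder's output layer must be able to emit the full range of reconstruction values, which LEAKYRELU (with MSE loss) allows.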
