
Commit 316a0cf

Update doc for RReLU
1 parent e58394a commit 316a0cf

File tree

1 file changed: +1 −0 lines changed


doc/transfer.md

Lines changed: 1 addition & 0 deletions
````diff
@@ -290,6 +290,7 @@ m=nn.ReLU(
    l, -- minimum factor for negative inputs, default: 1/8;
    u, -- maximum factor for negative inputs, default: 1/3;
    inplace -- if true the result will be written to the input tensor, default: false;
+   cw -- if true all elements of the same channel share the same `a`, default: false;
 )
 ```
````
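For reference, a minimal usage sketch of the documented constructor. That `cw` is passed as the fourth positional argument is an assumption based on this diff, and the tensor shapes are purely illustrative:

```lua
require 'nn'

-- Documented defaults: l = 1/8, u = 1/3, out-of-place,
-- per-element random factors.
local m = nn.RReLU()

-- Channel-wise variant described in this commit (assumption: `cw` is
-- the fourth constructor argument). With cw = true, every element of
-- a channel shares the same random factor `a`.
local mcw = nn.RReLU(1/8, 1/3, false, true)

local input = torch.randn(2, 3, 4, 4)  -- batch x channels x height x width
local output = mcw:forward(input)
local gradInput = mcw:backward(input, output)  -- forward() must be called first
```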
If `l == u`, an RReLU effectively becomes a LeakyReLU. Regardless of whether it operates in in-place mode, an RReLU internally allocates an input-sized `noise` tensor to store the random factors for negative inputs. The backward() operation assumes that forward() has been called beforehand.
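A quick sketch of the `l == u` case, checking it against `nn.LeakyReLU` with the same fixed slope (the slope value 1/3 here is arbitrary):

```lua
require 'nn'

local slope = 1/3
local rrelu = nn.RReLU(slope, slope)  -- l == u: the factor is no longer random
local leaky = nn.LeakyReLU(slope)

local x = torch.randn(5)
-- Both modules scale negative inputs by the same fixed factor,
-- so the outputs should match element-wise.
print(rrelu:forward(x))
print(leaky:forward(x))
```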
