doc/transfer.md
1 addition & 0 deletions
@@ -290,6 +290,7 @@ m=nn.ReLU(
    l,       -- minimum factor for negative inputs, default: 1/8;
    u,       -- maximum factor for negative inputs, default: 1/3;
    inplace  -- if true the result will be written to the input tensor, default: false;
+   cw       -- if true all elements of the same channel share the same `a`, default: false;
 )
 ```
 If `l == u` a RReLU effectively becomes a LeakyReLU. Regardless of operating in in-place mode a RReLU will internally allocate an input-sized `noise` tensor to store random factors for negative inputs. The backward() operation assumes that forward() has been called before.
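
For reference, a minimal usage sketch in Lua/Torch, assuming the `nn` package is installed. The `cw` (channel-wise) flag is the option documented by this diff and may not be available in older builds, so it is shown only in a comment here.

```lua
require 'nn'

-- RReLU with the default bounds l = 1/8, u = 1/3, operating out of place.
-- (Per this diff, a fourth argument `cw` would make all elements of a channel
-- share the same random factor `a`.)
local m = nn.RReLU()

local x = torch.randn(4, 3, 8, 8)   -- batch of 4 samples, 3 channels
local y = m:forward(x)               -- negative inputs scaled by random factors drawn from [l, u]

-- backward() reuses the `noise` tensor filled during forward(),
-- so forward() must be called first.
local gradInput = m:backward(x, torch.ones(y:size()))

-- With l == u the module degenerates to a LeakyReLU with a fixed slope.
local leaky = nn.RReLU(1/3, 1/3)
```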