Selu - 1 vs 6
- Selu1 → Selu6 +4 -4
Selu1 → Selu6 (RENAMED)

```diff
@@ -1 +1 @@
  Selu takes one input data (Tensor<T>) and produces one output data
  (Tensor<T>) where the scaled exponential linear unit function,
  y = gamma * (alpha * e^x - alpha) for x <= 0, y = gamma * x for x > 0,
  is applied to the tensor elementwise.
  **Attributes**
  * **alpha**:
-   Coefficient of SELU default to 1.
+   Coefficient of SELU default to 1.67326319217681884765625 (i.e.,
+   float32 approximation of 1.6732632423543772848170429916717).
- * **consumed_inputs**:
-   legacy optimization attribute.
  * **gamma**:
-   Coefficient of SELU default to 1.
+   Coefficient of SELU default to 1.05070102214813232421875 (i.e.,
+   float32 approximation of 1.0507009873554804934193349852946).
  **Inputs**
  * **X** (heterogeneous) - **T**:
    Input tensor
  **Outputs**
  * **Y** (heterogeneous) - **T**:
    Output tensor
  **Type Constraints**
  * **T** in (
    tensor(double),
    tensor(float),
    tensor(float16)
  ):
    Constrain input and output types to float tensors.
```