# Selu - 1 vs 6

The next section compares an older version of the operator to a newer version after both definitions are converted to markdown text. Green means an addition to the newer version, red means a deletion; anything else is unchanged.

Files changed (1): Selu1 → Selu6 (renamed, +4 −4)
```diff
@@ -1 +1 @@
  Selu takes one input data (Tensor<T>) and produces one output data
  (Tensor<T>) where the scaled exponential linear unit function,
  y = gamma * (alpha * e^x - alpha) for x <= 0, y = gamma * x for x > 0,
  is applied to the tensor elementwise.
  **Attributes**
  * **alpha**:
- Coefficient of SELU default to 1.67326319217681884765625 (i.e.,
- float32 approximation of 1.6732632423543772848170429916717).
+ Coefficient of SELU default to 1.6732.
+ * **consumed_inputs**:
+ legacy optimization attribute.
  * **gamma**:
- Coefficient of SELU default to 1.05070102214813232421875 (i.e.,
- float32 approximation of 1.0507009873554804934193349852946).
+ Coefficient of SELU default to 1.0507.
  **Inputs**
  * **X** (heterogeneous) - **T**:
    Input tensor
  **Outputs**
  * **Y** (heterogeneous) - **T**:
    Output tensor
  **Type Constraints**
  * **T** in (
    tensor(double),
    tensor(float),
    tensor(float16)
  ):
    Constrain input and output types to float tensors.
```
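
To make the formula concrete, here is a minimal NumPy sketch of the elementwise computation described in the spec. The function name and the test input are illustrative, not part of ONNX; the defaults are the float32-exact values documented in Selu-6 (Selu-1 lists the same defaults rounded to 1.6732 and 1.0507).

```python
import numpy as np

def selu(x, alpha=1.67326319217681884765625, gamma=1.05070102214813232421875):
    """Scaled exponential linear unit, applied elementwise.

    y = gamma * (alpha * e^x - alpha)  for x <= 0
    y = gamma * x                      for x >  0
    """
    x = np.asarray(x, dtype=np.float32)
    return np.where(x > 0, gamma * x, gamma * (alpha * np.exp(x) - alpha))

# Negative inputs saturate toward -gamma * alpha (about -1.758);
# positive inputs are simply scaled by gamma.
print(selu([-5.0, -1.0, 0.0, 1.0]))
```

When building a graph, a Selu node can be created with `onnx.helper`; omitting `alpha` and `gamma` leaves the runtime to apply the documented defaults. The tensor names `"X"` and `"Y"` below are placeholders, not fixed by the operator.

```python
from onnx import helper

# "X" and "Y" are illustrative value names for the node's input and output.
selu_node = helper.make_node("Selu", inputs=["X"], outputs=["Y"])
```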