Selu
Selu - 6
Version
name: Selu (GitHub)
domain: main
since_version: 6
function: False
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 6.
Summary
Selu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the scaled exponential linear unit function, y = gamma * (alpha * e^x - alpha) for x <= 0, y = gamma * x for x > 0, is applied to the tensor elementwise.
Attributes
alpha: Coefficient of SELU default to 1.67326319217681884765625 (i.e., float32 approximation of 1.6732632423543772848170429916717). Default value is 1.6732631921768188.
gamma: Coefficient of SELU default to 1.05070102214813232421875 (i.e., float32 approximation of 1.0507009873554804934193349852946). Default value is 1.0507010221481323.
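As a quick illustration of the piecewise formula and these defaults, here is a minimal NumPy sketch; the helper name selu_reference and the sample values are ours, not part of the ONNX spec:

import numpy as np

def selu_reference(x, alpha=1.6732631921768188, gamma=1.0507010221481323):
    # y = gamma * x                       for x > 0
    # y = gamma * (alpha * e^x - alpha)   for x <= 0
    return np.where(x > 0, gamma * x, gamma * alpha * (np.exp(x) - 1))

# x = 0 maps to 0; large negative x approaches -gamma * alpha (about -1.7581)
print(selu_reference(np.array([-5.0, 0.0, 2.0], dtype=np.float32)))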
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
Examples
selu_default
import numpy as np
import onnx

default_alpha = 1.67326319217681884765625
default_gamma = 1.05070102214813232421875
node = onnx.helper.make_node(
    'Selu',
    inputs=['x'],
    outputs=['y'],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
# Positive branch: gamma * x; negative branch: gamma * alpha * (e^x - 1).
# The clips zero each term outside its own branch, so the sum equals the
# piecewise SELU and exp() is never evaluated on positive inputs.
y = np.clip(x, 0, np.inf) * default_gamma + \
    (np.exp(np.clip(x, -np.inf, 0)) - 1) * default_alpha * default_gamma
# expect() is the ONNX backend-test helper that registers this test case.
expect(node, inputs=[x], outputs=[y],
       name='test_selu_default')
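To run the node outside the backend-test harness, it can be wrapped in a minimal graph and executed; a sketch assuming onnxruntime is installed (the graph name 'selu_demo' is ours):

import numpy as np
import onnx
import onnxruntime as ort
from onnx import TensorProto, helper

# Wrap the Selu node in a minimal graph so it can be executed.
node = helper.make_node('Selu', inputs=['x'], outputs=['y'])
graph = helper.make_graph(
    [node],
    'selu_demo',
    [helper.make_tensor_value_info('x', TensorProto.FLOAT, [3, 4, 5])],
    [helper.make_tensor_value_info('y', TensorProto.FLOAT, [3, 4, 5])],
)
model = helper.make_model(graph)
onnx.checker.check_model(model)

sess = ort.InferenceSession(model.SerializeToString(),
                            providers=['CPUExecutionProvider'])
x = np.random.randn(3, 4, 5).astype(np.float32)
(y,) = sess.run(None, {'x': x})  # y has SELU applied elementwise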
Differences
Selu-6 differs from Selu-1 only in its attributes; the summary, inputs, outputs, and type constraints are unchanged:

alpha: default value changed from 1.673200011253357 ("Coefficient of SELU default to 1.6732") to 1.6732631921768188 (the float32 approximation of 1.6732632423543772848170429916717).
consumed_inputs: the legacy optimization attribute was removed.
gamma: default value changed from 1.0506999492645264 ("Coefficient of SELU default to 1.0507") to 1.0507010221481323 (the float32 approximation of 1.0507009873554804934193349852946).
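When the operator version matters, the opset can be pinned explicitly at model-building time; a minimal sketch (the graph name 'selu_v6' is ours) that selects the Selu-6 semantics described above:

import onnx
from onnx import TensorProto, helper

node = helper.make_node('Selu', inputs=['x'], outputs=['y'])
graph = helper.make_graph(
    [node],
    'selu_v6',
    [helper.make_tensor_value_info('x', TensorProto.FLOAT, [2])],
    [helper.make_tensor_value_info('y', TensorProto.FLOAT, [2])],
)
# Pin the default domain ('') to opset 6, selecting Selu-6
# (no consumed_inputs, refined attribute defaults).
model = helper.make_model(graph, opset_imports=[helper.make_opsetid('', 6)])
onnx.checker.check_model(model)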
Selu - 1
Version
name: Selu (GitHub)
domain: main
since_version: 1
function: False
support_level: SupportType.COMMON
shape inference: False
This version of the operator has been available since version 1.
Summary
Selu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the scaled exponential linear unit function, y = gamma * (alpha * e^x - alpha) for x <= 0, y = gamma * x for x > 0, is applied to the tensor elementwise.
Attributes
alpha: Coefficient of SELU default to 1.6732. Default value is 1.673200011253357.
consumed_inputs: legacy optimization attribute.
gamma: Coefficient of SELU default to 1.0507. Default value is 1.0506999492645264.
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.