Selu
Selu - 6
Version
name: Selu
domain: main
since_version: 6
function: True
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 6.
Summary
Selu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the scaled exponential linear unit function, y = gamma * (alpha * e^x - alpha) for x <= 0, y = gamma * x for x > 0, is applied to the tensor elementwise.
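For intuition, the definition above translates directly into NumPy. The following is a minimal sketch of the elementwise formula under the default coefficients, not the ONNX implementation itself; the helper name selu_ref is ours:

import numpy as np

def selu_ref(x, alpha=1.67326319217681884765625, gamma=1.05070102214813232421875):
    # y = gamma * x for x > 0; y = gamma * alpha * (e^x - 1) for x <= 0
    # note: np.where evaluates both branches, which is fine for a sketch
    return np.where(x > 0, gamma * x, gamma * alpha * (np.exp(x) - 1.0))

print(selu_ref(np.array([-1.0, 0.0, 1.0], dtype=np.float32)))
# approximately [-1.11133, 0., 1.0507011]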
Attributes
alpha: Coefficient of SELU, defaulting to 1.67326319217681884765625 (the float32 approximation of 1.6732632423543772848170429916717).
gamma: Coefficient of SELU, defaulting to 1.05070102214813232421875 (the float32 approximation of 1.0507009873554804934193349852946); both roundings are checked in the sketch below.
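The float32 notes in parentheses can be verified directly: rounding the full-precision constants to float32 reproduces the stated defaults exactly. A quick check:

import numpy as np

# float32 rounding of the full-precision SELU constants equals the stated defaults
assert float(np.float32(1.6732632423543772848170429916717)) == 1.67326319217681884765625
assert float(np.float32(1.0507009873554804934193349852946)) == 1.05070102214813232421875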
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
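To illustrate the constraint T, here is a minimal sketch that wraps Selu in a model using one of the permitted types (float16) and validates it with the ONNX checker; the graph and tensor names are arbitrary, and opset 6 is pinned to match the version documented here:

import onnx
from onnx import TensorProto, helper

node = helper.make_node("Selu", inputs=["x"], outputs=["y"])
graph = helper.make_graph(
    [node],
    "selu_graph",
    [helper.make_tensor_value_info("x", TensorProto.FLOAT16, [3])],
    [helper.make_tensor_value_info("y", TensorProto.FLOAT16, [3])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 6)])
onnx.checker.check_model(model)  # float16 satisfies T, so the model checks out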
Examples
default
import numpy as np
import onnx
# `expect` is ONNX's internal backend-test helper; the import path below is
# assumed and may vary across onnx versions.
from onnx.backend.test.case.node import expect

node = onnx.helper.make_node(
    "Selu", inputs=["x"], outputs=["y"], alpha=2.0, gamma=3.0
)

x = np.array([-1, 0, 1]).astype(np.float32)
# expected output [-3.79272318, 0., 3.]
y = (
    np.clip(x, 0, np.inf) * 3.0
    + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0 * 3.0
)
expect(node, inputs=[x], outputs=[y], name="test_selu_example")

x = np.random.randn(3, 4, 5).astype(np.float32)
y = (
    np.clip(x, 0, np.inf) * 3.0
    + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0 * 3.0
)
expect(node, inputs=[x], outputs=[y], name="test_selu")
_selu_default
import numpy as np
import onnx
from onnx.backend.test.case.node import expect  # assumed helper path, as above

default_alpha = 1.67326319217681884765625
default_gamma = 1.05070102214813232421875

node = onnx.helper.make_node(
    "Selu",
    inputs=["x"],
    outputs=["y"],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = (
    np.clip(x, 0, np.inf) * default_gamma
    + (np.exp(np.clip(x, -np.inf, 0)) - 1) * default_alpha * default_gamma
)
expect(node, inputs=[x], outputs=[y], name="test_selu_default")
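The expected values can also be cross-checked with onnx's pure-Python reference implementation. A sketch, assuming onnx >= 1.13, where ReferenceEvaluator can be constructed from a single NodeProto:

import numpy as np
import onnx
from onnx.reference import ReferenceEvaluator

node = onnx.helper.make_node("Selu", inputs=["x"], outputs=["y"])
sess = ReferenceEvaluator(node)

x = np.array([-1, 0, 1]).astype(np.float32)
(y,) = sess.run(None, {"x": x})
print(y)  # SELU with default alpha/gamma, approximately [-1.11133, 0., 1.0507011]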
Selu - 1
Version
name: Selu
domain: main
since_version: 1
function: False
support_level: SupportType.COMMON
shape inference: False
This version of the operator has been available since version 1.
Summary
Selu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the scaled exponential linear unit function, y = gamma * (alpha * e^x - alpha) for x <= 0, y = gamma * x for x > 0, is applied to the tensor elementwise.
Attributes
alpha: Coefficient of SELU, defaulting to 1.6732.
consumed_inputs: legacy optimization attribute.
gamma: Coefficient of SELU, defaulting to 1.0507.
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.