LeakyRelu
LeakyRelu - 16
Version
name: LeakyRelu (GitHub)
domain: main
since_version: 16
function: True
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 16.
Summary
LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.
History - Version 16 adds bfloat16 to the types allowed.
Attributes
alpha: Coefficient of leakage. Default is 0.01.
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(bfloat16), tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
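Since opset 16 the constraint T also admits tensor(bfloat16). As a minimal sketch (the graph and tensor names here are illustrative, not part of the spec), a bfloat16 LeakyRelu graph can be declared with the onnx helpers and passed through the checker:
import onnx
from onnx import TensorProto, helper

# Single-node graph with bfloat16 inputs/outputs, which only falls
# inside the T constraint from opset 16 onward.
node = helper.make_node("LeakyRelu", inputs=["x"], outputs=["y"], alpha=0.1)
graph = helper.make_graph(
    [node],
    "leakyrelu_bf16",
    [helper.make_tensor_value_info("x", TensorProto.BFLOAT16, [3, 4, 5])],
    [helper.make_tensor_value_info("y", TensorProto.BFLOAT16, [3, 4, 5])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 16)])
onnx.checker.check_model(model)
Under an opset import below 16, bfloat16 inputs would fall outside T for this operator.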
Examples
default
import numpy as np
import onnx
# Note: `expect` below comes from ONNX's node test helpers
# (onnx/backend/test/case/node); it is not part of the public onnx API.
node = onnx.helper.make_node(
    "LeakyRelu", inputs=["x"], outputs=["y"], alpha=0.1
)
x = np.array([-1, 0, 1]).astype(np.float32)
# expected output [-0.1, 0., 1.]
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * 0.1
expect(node, inputs=[x], outputs=[y], name="test_leakyrelu_example")
x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * 0.1
expect(node, inputs=[x], outputs=[y], name="test_leakyrelu")
_leakyrelu_default
import numpy as np
import onnx
default_alpha = 0.01
node = onnx.helper.make_node(
    "LeakyRelu",
    inputs=["x"],
    outputs=["y"],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * default_alpha
expect(node, inputs=[x], outputs=[y], name="test_leakyrelu_default")
LeakyRelu - 6
Version
name: LeakyRelu (GitHub)
domain: main
since_version: 6
function: False
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 6.
Summary
LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.
Attributes
alpha: Coefficient of leakage. Default is 0.01.
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
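The piecewise definition in the summary can also be written directly with np.where; the following NumPy sketch (the function name is illustrative) is equivalent to the clip-based formulation used in the opset-16 examples above:
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # f(x) = alpha * x for x < 0, f(x) = x for x >= 0
    return np.where(x < 0, alpha * x, x)

x = np.array([-2.0, -0.5, 0.0, 3.0], dtype=np.float32)
assert np.allclose(leaky_relu(x, alpha=0.1), [-0.2, -0.05, 0.0, 3.0])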
LeakyRelu - 1
Version
name: LeakyRelu (GitHub)
domain: main
since_version: 1
function: False
support_level: SupportType.COMMON
shape inference: False
This version of the operator has been available since version 1.
Summary
LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.
Attributes
alpha: Coefficient of leakage. Defaults to 0.01.
consumed_inputs: legacy optimization attribute.
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
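To check which of these three definitions is in effect at a given opset, the registered schema can be queried; the sketch below assumes onnx.defs.get_schema accepts a maximum opset version, as in current onnx releases:
import onnx.defs

# Resolve the LeakyRelu schema active at each opset of interest.
for opset in (1, 6, 16):
    schema = onnx.defs.get_schema("LeakyRelu", opset)
    print(opset, "->", schema.since_version)
# Expected: 1 -> 1, 6 -> 6, 16 -> 16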