PRelu

PRelu - 16

Version

  • name: PRelu (GitHub)

  • domain: main

  • since_version: 16

  • function: True

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 16.

Summary

PRelu takes input data (Tensor<T>) and a slope tensor as input, and produces one output tensor (Tensor<T>) where the function f(x) = slope * x for x < 0, f(x) = x for x >= 0 is applied to the data tensor elementwise.

History: Version 16 adds bfloat16 to the allowed types.

This operator supports unidirectional broadcasting (the slope tensor must be unidirectionally broadcastable to the input tensor X); for more details, please check Broadcasting in ONNX.

Inputs

  • X (heterogeneous) - T: Input tensor

  • slope (heterogeneous) - T: Slope tensor. The shape of slope can be smaller than that of the first input X; if so, its shape must be unidirectionally broadcastable to X

Outputs

  • Y (heterogeneous) - T: Output tensor (same size as X)

Type Constraints

  • T in ( tensor(bfloat16), tensor(double), tensor(float), tensor(float16), tensor(int32), tensor(int64), tensor(uint32), tensor(uint64) ): Constrain input and output types to float/int tensors.

Examples

default

import numpy as np
import onnx

node = onnx.helper.make_node(
    "PRelu",
    inputs=["x", "slope"],
    outputs=["y"],
)

x = np.random.randn(3, 4, 5).astype(np.float32)
slope = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * slope

expect(node, inputs=[x, slope], outputs=[y], name="test_prelu_example")

_prelu_broadcast

import numpy as np
import onnx

node = onnx.helper.make_node(
    "PRelu",
    inputs=["x", "slope"],
    outputs=["y"],
)

x = np.random.randn(3, 4, 5).astype(np.float32)
slope = np.random.randn(5).astype(np.float32)
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * slope

expect(node, inputs=[x, slope], outputs=[y], name="test_prelu_broadcast")

PRelu - 9

Version

  • name: PRelu (GitHub)

  • domain: main

  • since_version: 9

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 9.

Summary

PRelu takes input data (Tensor<T>) and a slope tensor as input, and produces one output tensor (Tensor<T>) where the function f(x) = slope * x for x < 0, f(x) = x for x >= 0 is applied to the data tensor elementwise. This operator supports unidirectional broadcasting (the slope tensor must be unidirectionally broadcastable to the input tensor X); for more details, please check Broadcasting in ONNX.

Inputs

  • X (heterogeneous) - T: Input tensor

  • slope (heterogeneous) - T: Slope tensor. The shape of slope can be smaller than that of the first input X; if so, its shape must be unidirectionally broadcastable to X

Outputs

  • Y (heterogeneous) - T: Output tensor (same size as X)

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16), tensor(int32), tensor(int64), tensor(uint32), tensor(uint64) ): Constrain input and output types to float/int tensors.
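From opset 9 onward the type constraint also admits integer tensors. A minimal NumPy sketch of the same formula on int32 data (not part of the official examples; values are illustrative):

import numpy as np

# PRelu on int32 data, allowed since opset 9: f(x) = slope * x for x < 0, x otherwise.
x = np.array([[-4, -1, 0, 2, 5]], dtype=np.int32)
slope = np.array([2], dtype=np.int32)  # size-1 slope, broadcast over x
y = np.where(x < 0, slope * x, x)
print(y)  # [[-8 -2  0  2  5]]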

PRelu - 7

Version

  • name: PRelu (GitHub)

  • domain: main

  • since_version: 7

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 7.

Summary

PRelu takes input data (Tensor<T>) and a slope tensor as input, and produces one output tensor (Tensor<T>) where the function f(x) = slope * x for x < 0, f(x) = x for x >= 0 is applied to the data tensor elementwise. This operator supports unidirectional broadcasting (the slope tensor must be unidirectionally broadcastable to the input tensor X); for more details, please check Broadcasting in ONNX.

Inputs

  • X (heterogeneous) - T: Input tensor

  • slope (heterogeneous) - T: Slope tensor. The shape of slope can be smaller than that of the first input X; if so, its shape must be unidirectionally broadcastable to X

Outputs

  • Y (heterogeneous) - T: Output tensor (same size as X)

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

PRelu - 6

Version

  • name: PRelu (GitHub)

  • domain: main

  • since_version: 6

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 6.

Summary

PRelu takes input data (Tensor<T>) and a slope tensor as input, and produces one output tensor (Tensor<T>) where the function f(x) = slope * x for x < 0, f(x) = x for x >= 0 is applied to the data tensor elementwise.

Inputs

  • X (heterogeneous) - T: Input tensor

  • slope (heterogeneous) - T: Slope tensor. If slope has size 1, the value is shared across different channels (see the NumPy sketch after the type constraints below)

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
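The size-1 slope sharing described for the slope input above can be illustrated with plain NumPy; this is a sketch, not one of the official ONNX test cases, and the shapes are arbitrary.

import numpy as np

# A single slope value is shared across every channel of x.
x = np.random.randn(2, 3, 4).astype(np.float32)  # illustrative (N, C, ...) layout
slope = np.array([0.1], dtype=np.float32)        # size 1: shared across channels
y = np.where(x < 0, slope * x, x)
assert y.shape == x.shape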

PRelu - 1

Version

  • name: PRelu (GitHub)

  • domain: main

  • since_version: 1

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: False

This version of the operator has been available since version 1.

Summary

PRelu takes input data (Tensor<T>) and a slope tensor as input, and produces one output tensor (Tensor<T>) where the function f(x) = slope * x for x < 0, f(x) = x for x >= 0 is applied to the data tensor elementwise.

Attributes

  • consumed_inputs: legacy optimization attribute.

Inputs

  • X (heterogeneous) - T: Input tensor

  • slope (heterogeneous) - T: Slope tensor. If slope has size 1, the value is shared across different channels

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.