Softplus
Softplus - 1
Version
name: Softplus
domain: main
since_version: 1
function: True
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 1.
Summary
Softplus takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the softplus function, y = ln(exp(x) + 1), is applied to the tensor elementwise.
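The naive form ln(exp(x) + 1) overflows exp(x) in float32 once x is large, even though the result itself is finite (softplus(x) ≈ x for large positive x). Backends are free to use a numerically stable rewriting such as max(x, 0) + log1p(exp(-|x|)); the following is a minimal NumPy sketch of that reformulation (the name stable_softplus is illustrative only, not part of the ONNX spec):

import numpy as np

def stable_softplus(x: np.ndarray) -> np.ndarray:
    # Mathematically equal to np.log(np.exp(x) + 1), but avoids overflowing
    # exp(x) for large positive x: softplus(x) = max(x, 0) + log1p(exp(-|x|)).
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

x = np.array([-100.0, -1.0, 0.0, 1.0, 100.0], dtype=np.float32)
print(stable_softplus(x))  # approx. [0.0, 0.31326, 0.69315, 1.31326, 100.0]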
Inputs
X (heterogeneous) - T: Input tensor
Outputs
Y (heterogeneous) - T: Output tensor
Type Constraints
T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.
Examples
default
import numpy as np

import onnx
# `expect` is the ONNX backend-test helper used by the official operator
# examples; in the onnx source tree it is defined under
# onnx/backend/test/case/node.
from onnx.backend.test.case.node import expect

node = onnx.helper.make_node(
    "Softplus",
    inputs=["x"],
    outputs=["y"],
)

x = np.array([-1, 0, 1]).astype(np.float32)
y = np.log(
    np.exp(x) + 1
)  # expected output [0.31326166, 0.69314718, 1.31326163]
expect(node, inputs=[x], outputs=[y], name="test_softplus_example")

x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.log(np.exp(x) + 1)
expect(node, inputs=[x], outputs=[y], name="test_softplus")