Mish

Mish - 18

Version

  • name: Mish

  • domain: main

  • since_version: 18

  • function: True

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 18.

Summary

Mish: A Self Regularized Non-Monotonic Neural Activation Function.

Applies the Mish activation element-wise to the input tensor X using the formula:

mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
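
For reference, here is a minimal NumPy sketch of the formula above (the helper name mish is illustrative; np.logaddexp(0, x) is used as a numerically stable form of softplus):

import numpy as np

def mish(x: np.ndarray) -> np.ndarray:
    # softplus(x) = ln(1 + e^x); logaddexp(0, x) = ln(e^0 + e^x)
    # avoids overflow in exp() for large positive x.
    return x * np.tanh(np.logaddexp(0, x))

print(mish(np.array([-1.0, 0.0, 1.0], dtype=np.float32)))
# -> approximately [-0.3034, 0.0, 0.8651]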

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input X and output types to float tensors.

Examples

default

import numpy as np
import onnx

# expect() is the helper used by the ONNX node tests; inside the onnx
# source tree it can be imported as below.
from onnx.backend.test.case.node import expect

node = onnx.helper.make_node("Mish", inputs=["X"], outputs=["Y"])

input_data = np.linspace(-10, 10, 10000, dtype=np.float32)

# Expected output: mish(x) = x * tanh(softplus(x)), with softplus
# computed as log1p(exp(x)) (safe here, since the inputs stay <= 10).
expected_output = input_data * np.tanh(np.log1p(np.exp(input_data)))

expect(node, inputs=[input_data], outputs=[expected_output], name="test_mish")
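
Outside the test harness, the node can also be evaluated directly. The following is a minimal sketch assuming onnx >= 1.13, which provides onnx.reference.ReferenceEvaluator; the graph and tensor names are illustrative:

import numpy as np
import onnx
from onnx import TensorProto, helper
from onnx.reference import ReferenceEvaluator

# Wrap the Mish node in a single-node model so it can be run.
node = helper.make_node("Mish", inputs=["X"], outputs=["Y"])
graph = helper.make_graph(
    [node],
    "mish_graph",
    [helper.make_tensor_value_info("X", TensorProto.FLOAT, [None])],
    [helper.make_tensor_value_info("Y", TensorProto.FLOAT, [None])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 18)])
onnx.checker.check_model(model)

x = np.array([-1.0, 0.0, 1.0], dtype=np.float32)
(y,) = ReferenceEvaluator(model).run(None, {"X": x})
print(y)  # -> approximately [-0.3034, 0.0, 0.8651]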