Relu - 6 vs 13

The following section compares an older version of the operator to a newer one, after both definitions have been converted to markdown text. Green indicates an addition in the newer version, red a deletion. Everything else is unchanged.

Files changed (1):

1. Relu6 → Relu13 (renamed, +0 -1)
```diff
@@ -1 +1 @@
  Relu takes one input data (Tensor<T>) and produces one output data
  (Tensor<T>) where the rectified linear function, y = max(0, x), is applied to
  the tensor elementwise.

  **Inputs**

  * **X** (heterogeneous) - **T**:
    Input tensor

  **Outputs**

  * **Y** (heterogeneous) - **T**:
    Output tensor

  **Type Constraints**

  * **T** in (
-   tensor(bfloat16),
    tensor(double),
    tensor(float),
    tensor(float16)
  ):
    Constrain input and output types to float tensors.
```
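The rectified linear function referenced in both versions of the definition, y = max(0, x), can be sketched with NumPy; this is an illustrative reimplementation of the elementwise semantics, not the ONNX runtime kernel itself:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # y = max(0, x), applied to the tensor elementwise.
    return np.maximum(0, x)

y = relu(np.array([-2.0, -0.5, 0.0, 3.0], dtype=np.float32))
# Negative entries are clamped to 0; non-negative entries pass through.
```

The type-constraint diff above only changes which tensor element types `T` may take (e.g. whether `tensor(bfloat16)` is listed); the elementwise computation itself is identical across the two opset versions.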