Relu - 13 vs 14#

The next section compares an older version of the operator with a newer one after both definitions are converted into markdown text. Green indicates an addition to the newer version, red indicates a deletion. Everything else is unchanged.

Files changed (1)
  1. Relu13 → Relu14 +2 -6
Relu13 → Relu14 RENAMED
@@ -1,21 +1,17 @@
  Relu takes one input data (Tensor<T>) and produces one output data
  (Tensor<T>) where the rectified linear function, y = max(0, x), is applied to
  the tensor elementwise.
  **Inputs**
  * **X** (heterogeneous) - **T**:
  Input tensor
  **Outputs**
  * **Y** (heterogeneous) - **T**:
  Output tensor
  **Type Constraints**
  * **T** in (
  tensor(bfloat16),
  tensor(double),
  tensor(float),
- tensor(float16),
+ tensor(float16)
- tensor(int16),
- tensor(int32),
- tensor(int64),
- tensor(int8)
  ):
- Constrain input and output types to signed numeric tensors.
+ Constrain input and output types to float tensors.
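
The rectified linear function described in both versions, y = max(0, x), can be sketched in a few lines of NumPy. This is a minimal illustration of the operator's semantics, not part of the ONNX specification; the function name `relu` is ours:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Elementwise rectified linear function: y = max(0, x).
    # NumPy preserves the input dtype here, mirroring the T -> T constraint.
    return np.maximum(0, x)

# A float tensor satisfies the **T** type constraint in both versions.
x = np.array([-1.5, 0.0, 2.5], dtype=np.float32)
y = relu(x)
```

Note that the output tensor Y has the same element type and shape as the input X, which is exactly what the heterogeneous **T** → **T** signature in the definition expresses.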