Relu - 1 vs 13

The next section compares an older version of the operator with a newer one, after both definitions have been converted to markdown. Green marks an addition in the newer version, red marks a deletion. Everything else is unchanged.

Files changed (1)
  1. Relu1 → Relu13 +5 -1
Relu1 → Relu13 RENAMED
```diff
@@ -1 +1 @@
  Relu takes one input data (Tensor<T>) and produces one output data
  (Tensor<T>) where the rectified linear function, y = max(0, x), is applied to
  the tensor elementwise.
+
+ **Attributes**
+
+ * **consumed_inputs**:
+   legacy optimization attribute.
  **Inputs**
  * **X** (heterogeneous) - **T**:
    Input tensor
  **Outputs**
  * **Y** (heterogeneous) - **T**:
    Output tensor
  **Type Constraints**
  * **T** in (
-   tensor(bfloat16),
    tensor(double),
    tensor(float),
    tensor(float16)
    ):
  Constrain input and output types to float tensors.
```
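Both versions compute the same elementwise function, y = max(0, x); only the attributes and the set of allowed tensor element types differ between opsets. As a rough sketch of the semantics (plain NumPy, not the ONNX runtime itself; the `relu` helper name is ours, not part of any API):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Elementwise rectified linear function: y = max(0, x).
    # Negative entries are clamped to zero; non-negative entries pass through.
    return np.maximum(0, x)

# float32 is one of the element types permitted by the T type constraint.
x = np.array([[-1.5, 0.0, 2.0],
              [3.5, -0.5, 1.0]], dtype=np.float32)
y = relu(x)
print(y)
```

The same function applies regardless of tensor shape, since the operator is purely elementwise; the opset change affects only which dtypes a backend must accept for **X** and **Y**.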