Relu - 1 vs 6

Files changed (1)
  1. Relu1 → Relu6 +0 -5
Relu1 → Relu6 RENAMED
@@ -1,21 +1,16 @@
  Relu takes one input data (Tensor&lt;T&gt;) and produces one output data
  (Tensor&lt;T&gt;) where the rectified linear function, y = max(0, x), is applied to
  the tensor elementwise.
-
- **Attributes**
-
- * **consumed_inputs**:
-   legacy optimization attribute.
  **Inputs**
  * **X** (heterogeneous) - **T**:
    Input tensor
  **Outputs**
  * **Y** (heterogeneous) - **T**:
    Output tensor
  **Type Constraints**
  * **T** in (
    tensor(double),
    tensor(float),
    tensor(float16)
  ):
    Constrain input and output types to float tensors.
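
For context, the only change from Relu-1 to Relu-6 is the removal of the legacy `consumed_inputs` attribute; the elementwise semantics, y = max(0, x), and the float type constraints are unchanged. Below is a minimal sketch of those semantics in NumPy, followed by construction of an opset-6 Relu node with the `onnx.helper` API. The graph and tensor names (`relu_graph`, `X`, `Y`) are illustrative, not part of the spec.

```python
import numpy as np
import onnx
from onnx import TensorProto, helper

def relu_reference(x: np.ndarray) -> np.ndarray:
    """Elementwise rectified linear function: y = max(0, x)."""
    return np.maximum(0, x).astype(x.dtype)

# Sanity check of the elementwise semantics on a float32 tensor.
x = np.array([-1.5, 0.0, 2.0], dtype=np.float32)
print(relu_reference(x))  # [0. 0. 2.]

# Build a one-node graph using the opset-6 Relu. No attributes are set,
# since `consumed_inputs` was removed in this version.
node = helper.make_node("Relu", inputs=["X"], outputs=["Y"])
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [None])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [None])
graph = helper.make_graph([node], "relu_graph", [X], [Y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 6)])
onnx.checker.check_model(model)  # passes: the node matches the opset-6 schema
```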