GlobalLpPool - 1 vs 2
GlobalLpPool1 → GlobalLpPool2
RENAMED
@@ -1 +1 @@
-GlobalLpPool consumes an input tensor X and applies lp pool pooling across
+GlobalLpPool consumes an input tensor X and applies lp pool pooling across
 the values in the same channel. This is equivalent to LpPool with kernel size
 equal to the spatial dimension of input tensor.
 **Attributes**
 * **p**:
-  p value of the Lp norm used to pool over the input data
-  2.0.
+  p value of the Lp norm used to pool over the input data.
 **Inputs**
 * **X** (heterogeneous) - **T**:
   Input data tensor from the previous operator; dimensions for image
   case are (N x C x H x W), where N is the batch size, C is the number
   of channels, and H and W are the height and the width of the data.
-  For non image case, the
+  For non image case, the dimensions are in the form of (N x C x D1 x
   D2 ... Dn), where N is the batch size.
 **Outputs**
 * **Y** (heterogeneous) - **T**:
-  Output data tensor from pooling across the input tensor.
-
+  Output data tensor from pooling across the input tensor. The output
+  tensor has the same rank as the input. The first two dimensions of
+  output shape are the same as the input (N x C), while the other
+  dimensions are all 1.
 **Type Constraints**
 * **T** in (
   tensor(double),
   tensor(float),
   tensor(float16)
 ):
   Constrain input and output types to float tensors.