GlobalLpPool - 1 vs 2
The next section compares an older version of the operator with a newer one after both definitions are converted into markdown text. Lines prefixed with + are additions in the newer version, lines prefixed with - are deletions. Anything else is unchanged.
GlobalLpPool1 → GlobalLpPool2
RENAMED
    @@ -1 +1 @@
    - GlobalLpPool consumes an input tensor X and applies lp pool pooling across
    + GlobalLpPool consumes an input tensor X and applies lp pool pooling across the
      the values in the same channel. This is equivalent to LpPool with kernel size
      equal to the spatial dimension of input tensor.
      **Attributes**
      * **p**:
    -   p value of the Lp norm used to pool over the input data
    +   p value of the Lp norm used to pool over the input data, default is
    +   2.0.
      **Inputs**
      * **X** (heterogeneous) - **T**:
        Input data tensor from the previous operator; dimensions for image
        case are (N x C x H x W), where N is the batch size, C is the number
        of channels, and H and W are the height and the width of the data.
    -   For non image case, the
    +   For non image case, the dimension are in the form of (N x C x D1 x
        D2 ... Dn), where N is the batch size.
      **Outputs**
      * **Y** (heterogeneous) - **T**:
    -   Output data tensor from pooling across the input tensor.
    +   Output data tensor from pooling across the input tensor. Dimensions
    +   will be N x C x 1 x 1
    -   tensor has the same rank as the input. The first two dimensions of
    -   output shape are the same as the input (N x C), while the other
    -   dimensions are all 1.
      **Type Constraints**
      * **T** in (
        tensor(double),
        tensor(float),
        tensor(float16)
        ):
        Constrain input and output types to float tensors.
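Both versions describe the same computation: an Lp norm taken over every spatial axis of the input, which collapses those axes to size 1 while keeping the batch and channel axes. The NumPy sketch below is only an illustration of that semantics (the function name and the default p=2.0 are chosen here, not taken from the ONNX reference implementation); it shows the N x C x 1 x 1 output shape for the image case.

```python
import numpy as np

def global_lp_pool(x: np.ndarray, p: float = 2.0) -> np.ndarray:
    # Pool with the Lp norm over every axis after N and C, keeping the
    # reduced axes so the output rank matches the input rank.
    spatial_axes = tuple(range(2, x.ndim))
    return np.sum(np.abs(x) ** p, axis=spatial_axes, keepdims=True) ** (1.0 / p)

x = np.random.randn(2, 3, 4, 5).astype(np.float32)  # (N, C, H, W) image case
print(global_lp_pool(x).shape)                       # (2, 3, 1, 1) -> N x C x 1 x 1
```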
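To exercise the operator itself rather than a re-implementation, a single-node graph can be built with onnx.helper. This is a minimal sketch under the assumption that opset 2 is targeted, where the p attribute is an integer; the graph name and the symbolic dimensions N, C, H, W are placeholders chosen for the example.

```python
import onnx
from onnx import TensorProto, helper

# One GlobalLpPool node: (N, C, H, W) in, (N, C, 1, 1) out.
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, ["N", "C", "H", "W"])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, ["N", "C", 1, 1])
node = helper.make_node("GlobalLpPool", inputs=["X"], outputs=["Y"], p=2)

graph = helper.make_graph([node], "global_lp_pool_example", [X], [Y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 2)])
onnx.checker.check_model(model)
```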