This class implements a layer which performs mean pooling over one of the blob dimensions.
```c++
// Projection dimension
void SetDimension( TBlobDim dimension );
```

The dimension along which mean pooling is performed. The default value is `BD_Width`.
```c++
void SetRestoreOriginalImageSize( bool flag );
```

If `true`, the output blob is of the same size as the input blob, with the mean values broadcast along the pooling dimension. If `false`, the output blob size along the pooling dimension is `1`. The default value is `false`.
The layer has no trainable parameters.
The single input accepts a blob of the following dimensions:

- `BatchLength * BatchWidth * ListSize` - the number of images in the set
- `Height` - the images' height
- `Width` - the images' width
- `Depth * Channels` - the number of channels the image format uses
The single output contains a blob with the results. If `GetRestoreOriginalImageSize` is `true`, the output is of the same size as the input. If it is `false`, the output size along the pooling dimension is equal to `1`, and the rest of the dimensions are the same as the input's.