The class implements a layer that performs local response normalization using the following formula:
LRN(x)[obj][ch] = x[obj][ch] / ((bias + alpha * sqrSum[obj][ch] / windowSize) ^ beta)
where:

- obj is the index of the object, in the range [0; BlobSize / Channels)
- ch is the index of the channel, in the range [0; Channels)
- windowSize, bias, alpha, beta are the layer settings
- sqrSum is calculated using the following formula:
sqrSum(x)[obj][ch] = sum(x[obj][i] * x[obj][i] for each i in [ch_min, ch_max])
ch_min = max(0, ch - floor((windowSize - 1)/2))
ch_max = min(Channels - 1, ch + ceil((windowSize - 1)/2))
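To make the window arithmetic concrete: with windowSize = 3 and 8 channels, the window for ch = 0 covers channels 0 through 1, and the window for ch = 4 covers channels 3 through 5. A minimal C++ sketch of the whole formula for a single object follows; the lrn function name and the plain std::vector<float> buffers are illustrative assumptions, not part of the layer's API.

```c++
#include <algorithm>
#include <cmath>
#include <vector>

// Reference sketch of LRN for a single object: x holds one value per channel.
std::vector<float> lrn( const std::vector<float>& x, int windowSize,
	float bias, float alpha, float beta )
{
	const int channels = static_cast<int>( x.size() );
	std::vector<float> result( channels );
	for( int ch = 0; ch < channels; ++ch ) {
		// The window spans floor((windowSize - 1) / 2) channels before the
		// current one and ceil((windowSize - 1) / 2) after it (the latter
		// equals windowSize / 2 in integer arithmetic), clipped to the blob.
		const int chMin = std::max( 0, ch - ( windowSize - 1 ) / 2 );
		const int chMax = std::min( channels - 1, ch + windowSize / 2 );
		// sqrSum over the window.
		float sqrSum = 0.f;
		for( int i = chMin; i <= chMax; ++i ) {
			sqrSum += x[i] * x[i];
		}
		result[ch] = x[ch] / std::pow( bias + alpha * sqrSum / windowSize, beta );
	}
	return result;
}
```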
void SetWindowSize( int value );
Sets the size of the window used when calculating sqrSum.
void SetBias( float value );
Sets the bias value, which is added to the scaled sum of squares.
void SetAlpha( float value );
Sets the scale value. The sum of squares is multiplied by this value.
void SetBeta( float value );
Sets the exponent used in the formula.
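Taken together, the four settings might be configured as in the following sketch. The layer variable is hypothetical; it stands for an instance of this class created and connected elsewhere, and the parameter values are illustrative only.

```c++
// Hypothetical configuration; the values shown are common LRN defaults
// and are not prescribed by this layer.
layer->SetWindowSize( 5 );  // sqrSum window spans up to 5 channels
layer->SetBias( 2.f );      // added to the scaled sum of squares
layer->SetAlpha( 1e-4f );   // multiplies the sum of squares
layer->SetBeta( 0.75f );    // exponent applied to the denominator
```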
There are no trainable parameters for this layer.
The single input accepts a blob of any size.
The single output contains a blob of the same size with the results of local response normalization.