This class implements a layer that converts a bitset into vectors of ones and zeros.
```c++
void SetBitSetSize( int bitSetSize );
```

Sets the bitset size, that is, the length of the ones-and-zeros vector produced for each bitset.
The layer has no trainable parameters.
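To make the conversion concrete, here is a minimal standalone sketch of the idea, not the layer's actual implementation. The 32-bit word size, the low-to-high bit order, and the `VectorizeBitSet` name are assumptions made for illustration only.

```c++
#include <cstdint>
#include <vector>

// Expands a bitset packed into 32-bit words into a vector of ones and zeros.
// bitSetSize plays the role of the value passed to SetBitSetSize().
std::vector<float> VectorizeBitSet( const std::vector<uint32_t>& words, int bitSetSize )
{
	std::vector<float> result( bitSetSize );
	for( int i = 0; i < bitSetSize; ++i ) {
		// Bit i is stored in word i / 32 at position i % 32 (assumed packing).
		result[i] = ( ( words[i / 32] >> ( i % 32 ) ) & 1 ) != 0 ? 1.f : 0.f;
	}
	return result;
}
```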
The single input accepts a blob with `int` data, of the dimensions:

- `BatchLength * BatchWidth * ListSize * Height * Width * Depth` is the number of bitsets
- `Channels` contains the bitset itself
The single output contains a blob of the dimensions:

- `Channels` is equal to `GetBitSetSize()`
- the other dimensions are equal to the input dimensions
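As a worked example of the dimension change (the concrete numbers and the 32-bit packing are assumptions for illustration):

```c++
// Assumed setup: bitset size 64, bitsets packed into 32-bit ints.
// Input blob:  BatchWidth = 10, Channels = 2   -> ten bitsets, two ints each
// Output blob: BatchWidth = 10, Channels = 64  -> each bitset expanded to 64 ones and zeros
```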