# BrainScript Full Function Reference

This section provides information on BrainScript built-in functions. The declarations of all built-in functions can be found in the file `CNTK.core.bs`, located next to the CNTK binary. The primitive operations and layers are declared in the global namespace. Additional operations are declared in namespaces and are given here with their respective prefix (e.g. `BS.RNNs.LSTMP`).
## Layers

* `DenseLayer {outDim, bias=true, activation=Identity, init='uniform', initValueScale=1}`
* `ConvolutionalLayer {numOutputChannels, filterShape, activation=Identity, init='uniform', initValueScale=1, stride=1, pad=false, lowerPad=0, upperPad=0, bias=true}`
* `MaxPoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}`
* `AveragePoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}`
* `EmbeddingLayer {outDim, embeddingPath='', transpose=false}`
* `RecurrentLSTMLayer {outputDim, cellShape=None, goBackwards=false, enableSelfStabilization=false}`
* `DelayLayer {T=1, defaultHiddenActivation=0}`
* `Dropout`
* `BatchNormalizationLayer {spatialRank=0, initialScale=1, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true}`
* `LayerNormalizationLayer {initialScale=1, initialBias=0}`
* `StabilizerLayer{}`
* `FeatureMVNLayer{}`
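As an illustration of how these layer factories compose, here is a hedged sketch of a small classifier. It assumes the `Sequential` combinator and the `ReLU` activation from the layers library, neither of which is declared in the listing above; the dimensions are placeholders.

```
# hypothetical two-layer classifier over a 40-dimensional input
model = Sequential (
    DenseLayer {256, activation=ReLU} :
    DenseLayer {10}                       # un-normalized log posteriors
)
features = Input (40)
z = model (features)
```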
## Layer building

## Activation functions
## Elementwise operations, unary

* `Abs (x)`
* `Ceil (x)`
* `Cosine (x)`
* `Clip (x, minValue, maxValue)`
* `Exp (x)`
* `Floor (x)`
* `Log (x)`
* `Negate (x)` or `-x`
* `BS.Boolean.Not (b)` or `!b`
* `Reciprocal (x)`
* `Round (x)`
* `Sin (x)`
* `Sqrt (x)`
## Elementwise operations, binary

* `ElementTimes (x, y)` or `x .* y`
* `Minus (x, y)` or `x - y`
* `Plus (x, y)` or `x + y`
* `LogPlus (x, y)`
* `Less (x, y)`
* `Equal (x, y)`
* `Greater (x, y)`
* `GreaterEqual (x, y)`
* `NotEqual (x, y)`
* `LessEqual (x, y)`
* `BS.Boolean.And (a, b)`
* `BS.Boolean.Or (a, b)`
* `BS.Boolean.Xor (a, b)`

## Elementwise operations, ternary

* `BS.Boolean.If (condition, thenVal, elseVal)`
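These primitives compose like ordinary expressions. As a sketch using only the operations listed above, a hinge term max(0, 1 - x) could be written as follows (the function name `Hinge` is hypothetical):

```
# Hinge (x) = max (0, 1 - x), spelled out with elementwise and ternary ops
Hinge (x) = [
    slack = Minus (BS.Constants.One, x)                 # 1 - x
    out   = BS.Boolean.If (Greater (slack, BS.Constants.Zero),
                           slack,                       # keep 1 - x where positive
                           BS.Constants.Zero)           # else 0
].out
```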
## Matrix product and convolution operations

* `Times (A, B, outputRank=1)` or `A * B`
* `TransposeTimes (A, B, outputRank=1)`
* `Convolution (weights, x, kernelShape, mapDims=(0), stride=(1), sharing=(true), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW', maxTempMemSizeInSamples=0)`
* `Pooling (x, poolKind/*'max'|'average'*/, kernelShape, stride=(1), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW')`
* `ROIPooling (x, rois, roiOutputShape, spatialScale=1.0/16.0)`
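For illustration, here is a dense projection written directly with `Times` and `ParameterTensor`; `outDim`, `inDim`, and the input `x` are placeholders, not names defined in this reference.

```
W = ParameterTensor {(outDim:inDim), init='uniform'}
b = ParameterTensor {outDim, init='fixedValue', initValue=0.0}
z = W * x + b            # shorthand for Plus (Times (W, x), b)
```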
## Learnable parameters and constants

* `ParameterTensor {shape, learningRateMultiplier=1.0, init='uniform'/*|gaussian*/, initValueScale=1.0, initValue=0.0, randomSeed=-1, initFromFilePath=''}`
* `Constant {scalarValue, rows=1, cols=1}`
* `BS.Constants.Zero`, `BS.Constants.One`
* `BS.Constants.True`, `BS.Constants.False`, `BS.Constants.None`
* `BS.Constants.OnesTensor (shape)`
* `BS.Constants.ZeroSequenceLike (x)`
## Inputs

* `Input (shape, dynamicAxis='', sparse=false, tag='feature')`
* `DynamicAxis{}`
* `EnvironmentInput (propertyName)`
* `Mean (x)`, `InvStdDev (x)`
## Loss functions and metrics

* `CrossEntropyWithSoftmax (targetDistribution, nonNormalizedLogClassPosteriors)`
* `CrossEntropy (targetDistribution, classPosteriors)`
* `Logistic (label, probability)`
* `WeightedLogistic (label, probability, instanceWeight)`
* `ClassificationError (labels, nonNormalizedLogClassPosteriors)`
* `MatrixL1Reg (matrix)`
* `MatrixL2Reg (matrix)`
* `SquareError (x, y)`
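A typical wiring of criterion and metric nodes might look like the sketch below; `model`, `inDim`, and `numClasses` are placeholders, and the `tag` values are assumed to follow the same convention as the `tag` parameter of `Input` above.

```
features = Input (inDim)                       # tag='feature' is the default
labels   = Input (numClasses, tag='label')
z    = model (features)                        # un-normalized log posteriors
ce   = CrossEntropyWithSoftmax (labels, z, tag='criterion')
errs = ClassificationError     (labels, z, tag='evaluation')
```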
## Reductions

* `ReduceSum (z, axis=None)`
* `ReduceLogSum (z, axis=None)`
* `ReduceMean (z, axis=None)`
* `ReduceMin (z, axis=None)`
* `ReduceMax (z, axis=None)`
* `CosDistance (x, y)`
* `SumElements (z)`
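For example, `ReduceLogSum` can express a log-softmax along a static axis. This is a sketch: it assumes 1-based axis numbering and that the reduction broadcasts back against `z`.

```
# log softmax over the first static axis of z
LogSoftmax (z) = Minus (z, ReduceLogSum (z, axis=1))
```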
## Training operations

* `BatchNormalization (input, scale, bias, runMean, runInvStdDev, spatial, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true, imageLayout='CHW')`
* `Dropout (x)`
* `Stabilize (x, enabled=true)`
* `StabilizeElements (x, inputDim=x.dim, enabled=true)`
* `CosDistanceWithNegativeSamples (x, y, numShifts, numNegSamples)`
## Reshaping operations

* `CNTK2.Reshape (x, shape, beginAxis=0, endAxis=0)`
* `ReshapeDimension (x, axis, shape) = CNTK2.Reshape (x, shape, beginAxis=axis, endAxis=axis + 1)`
* `FlattenDimensions (x, axis, num) = CNTK2.Reshape (x, 0, beginAxis=axis, endAxis=axis + num)`
* `SplitDimension (x, axis, N) = ReshapeDimension (x, axis, 0:N)`
* `Slice (beginIndex, endIndex, input, axis=1)`
* `BS.Sequences.First (x) = Slice (0, 1, x, axis=-1)`
* `BS.Sequences.Last (x) = Slice (-1, 0, x, axis=-1)`
* `Splice (inputs, axis=1)`
* `TransposeDimensions (x, axis1, axis2)`
* `Transpose (x) = TransposeDimensions (x, 1, 2)`
* `BS.Sequences.BroadcastSequenceAs (type, data1)`
* `BS.Sequences.Gather (where, x)`
* `BS.Sequences.Scatter (where, y)`
* `BS.Sequences.IsFirst (x)`
* `BS.Sequences.IsLast (x)`
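As a sketch, `Slice` and `Splice` can split a vector and reassemble it in swapped order; the input `x` (of dimension `2*d`) and `d` are placeholders.

```
front   = Slice (0, d,   x, axis=1)    # elements [0, d)
back    = Slice (d, 2*d, x, axis=1)    # elements [d, 2*d)
swapped = Splice (back : front, axis=1)
```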
## Recurrence

* `OptimizedRNNStack (weights, input, hiddenDims, numLayers=1, bidirectional=false, recurrentOp='lstm')`
* `BS.Loop.Previous (x, timeStep=1, defaultHiddenActivation=0)`
* `PastValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Previous (0, shape, ...)`
* `BS.Loop.Next (x, timeStep=1, defaultHiddenActivation=0)`
* `FutureValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Next (0, shape, ...)`
* `LSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, aux=BS.Constants.None, auxDim=aux.shape, prevState, enableSelfStabilization=false)`
* `BS.Boolean.Toggle (clk, initialValue=BS.Constants.False)`
* `BS.RNNs.RecurrentLSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, previousHook=BS.RNNs.PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputDim=0, layerIndex=0, enableSelfStabilization=false)`
* `BS.RNNs.RecurrentLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.shape, previousHook=PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputShape=0, enableSelfStabilization=false)`
* `BS.RNNs.RecurrentBirectionalLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.dim, previousHook=PreviousHC, nextHook=NextHC, enableSelfStabilization=false)`
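A minimal hand-rolled recurrence can be built from `BS.Loop.Previous` alone. This is a sketch: `W`, `R`, `b`, and the input `x` are assumed to be defined elsewhere (e.g. via `ParameterTensor`), and `Tanh` is assumed available as a primitive activation.

```
# plain (non-LSTM) recurrent step: h(t) = Tanh (W x(t) + R h(t-1) + b)
h = Tanh (W * x + R * BS.Loop.Previous (h) + b)
```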
## Sequence-to-sequence support

* `BS.Seq2Seq.CreateAugmentWithFixedWindowAttentionHook (attentionDim, attentionSpan, decoderDynamicAxis, encoderOutput, enableSelfStabilization=false)`
* `BS.Seq2Seq.GreedySequenceDecoderFrom (modelAsTrained)`
* `BS.Seq2Seq.BeamSearchSequenceDecoderFrom (modelAsTrained, beamDepth)`
## Special-purpose operations

* `ClassBasedCrossEntropyWithSoftmax (labelClassDescriptorVectorSequence, mainInputInfo, mainWeight, classLogProbsBeforeSoftmax)`
## Model editing

* `BS.Network.Load (pathName)`
* `BS.Network.Edit (inputModel, editFunctions, additionalRoots)`
* `BS.Network.CloneFunction (inputNodes, outputNodes, parameters="learnable" /*|"constant"|"shared"*/)`
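As an example use of these editing functions, one might clone part of a trained model as a frozen feature extractor. This is a sketch: the model path and the node names `features` and `h2` are hypothetical, and `myInput` is a placeholder.

```
network = BS.Network.Load ("model.dnn")
featExt = BS.Network.CloneFunction (network.features, network.h2,
                                    parameters="constant")   # freeze weights
out = featExt (myInput)
```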
## Other

* `Fail (what)`
* `IsSameObject (a, b)`
* `Trace (node, say='', logFrequency=traceFrequency, logFirst=10, logGradientToo=false, onlyUpToRow=100000000, onlyUpToT=100000000, format=[])`
## Deprecated

* `ErrorPrediction (labels, nonNormalizedLogClassPosteriors)`
* `ColumnElementTimes (...) = ElementTimes (...)`
* `DiagTimes (...) = ElementTimes (...)`
* `LearnableParameter (...) = Parameter (...)`
* `LookupTable (embeddingMatrix, inputTensor)`
* `RowRepeat (input, numRepeats)`
* `RowSlice (beginIndex, numRows, input) = Slice (beginIndex, beginIndex + numRows, input, axis=1)`
* `RowStack (inputs)`
* `RowElementTimes (...) = ElementTimes (...)`
* `Scale (...) = ElementTimes (...)`
* `ConstantTensor (scalarVal, shape)`
* `Parameter (outputDim, inputDim, ...) = ParameterTensor ((outputDim:inputDim), ...)`
* `WeightParam (outputDim, inputDim) = Parameter (outputDim, inputDim, init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)`
* `DiagWeightParam (outputDim) = ParameterTensor ((outputDim), init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)`
* `BiasParam (dim) = ParameterTensor ((dim), init='fixedValue', value=0.0)`
* `ScalarParam() = BiasParam (1)`
* `SparseInput (shape, dynamicAxis='', tag='feature')`
* `ImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')`
* `SparseImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')`
* `MeanVarNorm (feat) = PerDimMeanVarNormalization (feat, Mean (feat), InvStdDev (feat))`
* `PerDimMeanVarNormalization (x, mean, invStdDev)`, `PerDimMeanVarDeNormalization (x, mean, invStdDev)`
* `ReconcileDynamicAxis (dataInput, layoutInput)`