Full BrainScript function reference
This section provides information on BrainScript's built-in functions.
The declarations of all built-in functions can be found in CNTK.core.bs, located next to the CNTK binary.
Primitive operations and layers are declared in the global namespace. Additional operations are declared in namespaces and are given below with their respective prefix (for example, BS.RNN.LSTMP).
Layers
DenseLayer {outDim, bias=true, activation=Identity, init='uniform', initValueScale=1}
ConvolutionalLayer {numOutputChannels, filterShape, activation=Identity, init='uniform', initValueScale=1, stride=1, pad=false, lowerPad=0, upperPad=0, bias=true}
MaxPoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
AveragePoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
EmbeddingLayer {outDim, embeddingPath='', transpose=false}
RecurrentLSTMLayer {outputDim, cellShape=None, goBackwards=false, enableSelfStabilization=false}
DelayLayer {T=1, defaultHiddenActivation=0}
Dropout
BatchNormalizationLayer {spatialRank=0, initialScale=1, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true}
LayerNormalizationLayer {initialScale=1, initialBias=0}
StabilizerLayer{}
FeatureMVNLayer{}
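For illustration, a minimal sketch of how these layer factories compose into a model; the image shape (28:28:1), the layer widths, and the class count 10 are assumptions, not part of the reference.

```
# minimal sketch (assumed dimensions): a small convolutional classifier
features = Input {(28:28:1)}           # assumed input image shape
labels   = Input {10, tag = 'label'}   # assumed number of classes

model = Sequential (
    ConvolutionalLayer {16, (5:5), pad = true, activation = ReLU} :
    MaxPoolingLayer    {(2:2), stride = (2:2)} :
    DenseLayer         {64, activation = ReLU} :
    Dropout :
    DenseLayer         {10}            # unnormalized class scores
)

z  = model (features)
ce = CrossEntropyWithSoftmax (labels, z)
```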
Creating layers
Activation functions
Elementwise operations, unary
Abs (x)
Ceil (x)
Cosine (x)
Clip (x, minValue, maxValue)
Exp (x)
Floor (x)
Log (x)
Negate (x), operator form: -x
BS.Boolean.Not (b), operator form: !b
Reciprocal (x)
Round (x)
Sin (x)
Sqrt (x)
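A minimal sketch of unary operations, which apply element-wise to every value of their argument; the input dimension 3 is an assumption for illustration.

```
# minimal sketch, assumed 3-dimensional input
x = Input {3}
y = Exp (Negate (Abs (x)))    # y_i = exp(-|x_i|)
r = Round (Sqrt (Abs (x)))    # r_i = round(sqrt(|x_i|))
```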
Elementwise operations, binary
ElementTimes (x, y), operator form: x .* y
Minus (x, y), operator form: x - y
Plus (x, y), operator form: x + y
LogPlus (x, y)
Less (x, y)
Equal (x, y)
Greater (x, y)
GreaterEqual (x, y)
NotEqual (x, y)
LessEqual (x, y)
BS.Boolean.And (a, b)
BS.Boolean.Or (a, b)
BS.Boolean.Xor (a, b)
Elementwise operations, ternary
BS.Boolean.If (condition, thenVal, elseVal)
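A minimal sketch combining the binary comparisons with BS.Boolean.If; the dimension 10 is an assumption for illustration.

```
# minimal sketch, assumed 10-dimensional inputs
a = Input {10}
b = Input {10}
diff    = Minus (a, b)                      # same as a - b
isAbig  = Greater (a, b)                    # 1 where a > b, 0 elsewhere
largest = BS.Boolean.If (isAbig, a, b)      # element-wise maximum of a and b
mix     = Plus (ElementTimes (a, b), diff)  # a .* b + (a - b)
```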
Matrix product and convolution operations
Times (A, B, outputRank=1), operator form: A * B
TransposeTimes (A, B, outputRank=1)
Convolution (weights, x, kernelShape, mapDims=(0), stride=(1), sharing=(true), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW', maxTempMemSizeInSamples=0)
Pooling (x, poolKind/*'max'|'average'*/, kernelShape, stride=(1), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW')
ROIPooling (x, rois, roiOutputShape, spatialScale=1.0/16.0)
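Convolution and Pooling are usually invoked through ConvolutionalLayer and the pooling layers above, while the product primitives are often used directly. A minimal sketch of the latter, with all dimensions assumed:

```
# minimal sketch, assumed dimensions
x = Input {128}
W = ParameterTensor {(10:128), init = 'uniform'}
z = Times (W, x)             # matrix product, same as W * x; result is 10-dimensional
y = TransposeTimes (W, z)    # W^T * z; result is 128-dimensional
```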
Learnable parameters and constants
ParameterTensor {shape, learningRateMultiplier=1.0, init='uniform'/*|gaussian*/, initValueScale=1.0, initValue=0.0, randomSeed=-1, initFromFilePath=''}
Constant {scalarValue, rows=1, cols=1}
BS.Constants.Zero, BS.Constants.One
BS.Constants.True, BS.Constants.False, BS.Constants.None
BS.Constants.OnesTensor (shape)
BS.Constants.ZeroSequenceLike (x)
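A minimal sketch of a linear layer built directly from ParameterTensor; the dimensions are assumptions, and the zero-initialized bias follows the fixedValue convention used by the deprecated BiasParam helper listed below.

```
# minimal sketch, assumed dimensions and bias initialization
inDim  = 100
outDim = 10
W = ParameterTensor {(outDim:inDim), init = 'uniform', initValueScale = 1}
b = ParameterTensor {outDim, init = 'fixedValue', initValue = 0}   # assumed zero-initialized bias
x = Input {inDim}
z = W * x + b
```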
Inputs
Input (shape, dynamicAxis='', sparse=false, tag='feature')
DynamicAxis{}
EnvironmentInput (propertyName)
Mean (x), InvStdDev (x)
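A minimal sketch of declaring inputs and normalizing features with Mean and InvStdDev; the dimensions 39 and 132 are assumptions for illustration.

```
# minimal sketch, assumed dimensions
feat   = Input {39,  tag = 'feature'}
labels = Input {132, tag = 'label'}
featNorm = PerDimMeanVarNormalization (feat, Mean (feat), InvStdDev (feat))
```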
Loss functions and metrics
CrossEntropyWithSoftmax (targetDistribution, nonNormalizedLogClassPosteriors)
CrossEntropy (targetDistribution, classPosteriors)
Logistic (label, probability)
WeightedLogistic (label, probability, instanceWeight)
ClassificationError (labels, nonNormalizedLogClassPosteriors)
MatrixL1Reg(matrix)
MatrixL2Reg(matrix)
SquareError (x, y)
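A minimal sketch of wiring a training criterion and an error metric; the feature and label dimensions are assumptions for illustration.

```
# minimal sketch, assumed dimensions
features = Input {256}
labels   = Input {10, tag = 'label'}
z    = DenseLayer {10} (DenseLayer {64, activation = ReLU} (features))
ce   = CrossEntropyWithSoftmax (labels, z)   # training criterion on unnormalized scores
errs = ClassificationError (labels, z)       # evaluation metric
```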
Reductions
ReduceSum (z, axis=None)
ReduceLogSum (z, axis=None)
ReduceMean (z, axis=None)
ReduceMin (z, axis=None)
ReduceMax (z, axis=None)
CosDistance (x, y)
SumElements (z)
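A minimal sketch of reductions over a static axis; the dimension and the explicit axis argument are assumptions for illustration.

```
# minimal sketch, assumed 10-dimensional input, reducing over the first static axis
v   = Input {10}
tot = ReduceSum    (v, axis = 1)
avg = ReduceMean   (v, axis = 1)
lse = ReduceLogSum (v, axis = 1)   # log of the sum of exponentials (log-sum-exp)
```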
Training operations
BatchNormalization (input, scale, bias, runMean, runInvStdDev, spatial, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true, imageLayout='CHW')
Dropout (x)
Stabilize (x, enabled=true)
StabilizeElements (x, inputDim=x.dim, enabled=true)
CosDistanceWithNegativeSamples (x, y, numShifts, numNegSamples)
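The raw BatchNormalization operation requires explicitly created scale, bias, and running-statistics parameters, which is why BatchNormalizationLayer above is normally preferred. Dropout, by contrast, is applied directly; a minimal sketch, with dimensions assumed:

```
# minimal sketch, assumed dimensions
h  = DenseLayer {256, activation = ReLU} (Input {128})
hd = Dropout (h)    # the dropout probability comes from the trainer settings, not from this call
```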
Reshaping operations
CNTK2.Reshape (x, shape, beginAxis=0, endAxis=0)
ReshapeDimension (x, axis, shape) = CNTK2.Reshape (x, shape, beginAxis=axis, endAxis=axis + 1)
FlattenDimensions (x, axis, num) = CNTK2.Reshape (x, 0, beginAxis=axis, endAxis=axis + num)
SplitDimension (x, axis, N) = ReshapeDimension (x, axis, 0:N)
Slice (beginIndex, endIndex, input, axis=1)
BS.Sequences.First (x) = Slice (0, 1, x, axis=-1)
BS.Sequences.Last (x) = Slice (-1, 0, x, axis=-1)
Splice (inputs, axis=1)
TransposeDimensions (x, axis1, axis2)
Transpose (x) = TransposeDimensions (x, 1, 2)
BS.Sequences.BroadcastSequenceAs (type, data1)
BS.Sequences.Gather (where, x)
BS.Sequences.Scatter (where, y)
BS.Sequences.IsFirst (x)
BS.Sequences.IsLast (x)
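A minimal sketch of Slice and Splice; the 100-dimensional input is an assumption for illustration.

```
# minimal sketch, assumed 100-dimensional input
x = Input {100}
firstHalf  = Slice (0, 50, x, axis = 1)     # elements [0, 50)
secondHalf = Slice (50, 100, x, axis = 1)   # elements [50, 100)
swapped    = Splice (secondHalf : firstHalf, axis = 1)   # 100 dimensions again, halves exchanged
```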
Recurrence
OptimizedRNNStack (weights, input, hiddenDims, numLayers=1, bidirectional=false, recurrentOp='lstm')
BS.Loop.Previous (x, timeStep=1, defaultHiddenActivation=0)
PastValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Previous (0, shape, ...)
BS.Loop.Next (x, timeStep=1, defaultHiddenActivation=0)
FutureValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Next (0, shape, ...)
LSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, aux=BS.Constants.None, auxDim=aux.shape, prevState, enableSelfStabilization=false)
BS.Boolean.Toggle (clk, initialValue=BS.Constants.False)
BS.RNNs.RecurrentLSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, previousHook=BS.RNNs.PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputDim=0, layerIndex=0, enableSelfStabilization=false)
BS.RNNs.RecurrentLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.shape, previousHook=PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputShape=0, enableSelfStabilization=false)
BS.RNNs.RecurrentBirectionalLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.dim, previousHook=PreviousHC, nextHook=NextHC, enableSelfStabilization=false)
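A minimal sketch of a simple recurrence built from PastValue: the self-reference to h forms the loop, which the delay node breaks. All dimensions are assumptions for illustration.

```
# minimal sketch, assumed dimensions
x    = Input {64}
hDim = 128
W = ParameterTensor {(hDim:64)}
R = ParameterTensor {(hDim:hDim)}
h = Sigmoid (W * x + R * PastValue (hDim, h))   # h_t = sigmoid(W x_t + R h_{t-1})
```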
Sequence-to-sequence support
BS.Seq2Seq.CreateAugmentWithFixedWindowAttentionHook (attentionDim, attentionSpan, decoderDynamicAxis, encoderOutput, enableSelfStabilization=false)
BS.Seq2Seq.GreedySequenceDecoderFrom (modelAsTrained)
BS.Seq2Seq.BeamSearchSequenceDecoderFrom (modelAsTrained, beamDepth)
Special-purpose operations
ClassBasedCrossEntropyWithSoftmax (labelClassDescriptorVectorSequence, mainInputInfo, mainWeight, classLogProbsBeforeSoftmax)
Model editing
BS.Network.Load (pathName)
BS.Network.Edit (inputModel, editFunctions, additionalRoots)
BS.Network.CloneFunction (inputNodes, outputNodes, parameters="learnable" /*|"constant"|"shared"*/)
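A minimal sketch of reusing part of a trained model as a fixed feature extractor; the model path, the node names pretrained.features and pretrained.h2, and the input shape are assumptions for illustration.

```
# minimal sketch, assumed path and node names
pretrained  = BS.Network.Load ("pretrained.dnn")
featExtract = BS.Network.CloneFunction (pretrained.features, pretrained.h2, parameters = "constant")
features = Input {(224:224:3)}   # assumed input shape
h = featExtract (features)
z = DenseLayer {10} (h)          # new, trainable classification head
```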
Other
Fail (what)
IsSameObject (a, b)
Trace (node, say='', logFrequency=traceFrequency, logFirst=10, logGradientToo=false, onlyUpToRow=100000000, onlyUpToT=100000000, format=[])
Deprecated
ErrorPrediction (labels, nonNormalizedLogClassPosteriors)
ColumnElementTimes (...) = ElementTimes (...)
DiagTimes (...) = ElementTimes (...)
LearnableParameter(...) = Parameter(...)
LookupTable (embeddingMatrix, inputTensor)
RowRepeat (input, numRepeats)
RowSlice (beginIndex, numRows, input) = Slice(beginIndex, beginIndex + numRows, input, axis = 1)
RowStack (inputs)
RowElementTimes (...) = ElementTimes (...)
Scale (...) = ElementTimes (...)
ConstantTensor (scalarVal, shape)
Parameter (outputDim, inputDim, ...) = ParameterTensor ((outputDim:inputDim), ...)
WeightParam (outputDim, inputDim) = Parameter (outputDim, inputDim, init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
DiagWeightParam (outputDim) = ParameterTensor ((outputDim), init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
BiasParam (dim) = ParameterTensor ((dim), init='fixedValue', value=0.0)
ScalarParam() = BiasParam (1)
SparseInput (shape, dynamicAxis='', tag='feature')
ImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
SparseImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
MeanVarNorm(feat) = PerDimMeanVarNormalization(feat, Mean (feat), InvStdDev (feat))
PerDimMeanVarNormalization (x, mean, invStdDev), PerDimMeanVarDeNormalization (x, mean, invStdDev)
ReconcileDynamicAxis (dataInput, layoutInput)