Full Function Reference for BrainScript
This section provides information on BrainScript's built-in functions. The declarations of all built-in functions can be found in the file CNTK.core.bs, located next to the CNTK binary.
Primitive operations and layers are declared in the global namespace. Additional operations are declared within namespaces and must be referenced with the corresponding prefix (for example, BS.RNNs.LSTMP).
Layers
DenseLayer {outDim, bias=true, activation=Identity, init='uniform', initValueScale=1}
ConvolutionalLayer {numOutputChannels, filterShape, activation=Identity, init='uniform', initValueScale=1, stride=1, pad=false, lowerPad=0, upperPad=0, bias=true}
MaxPoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
AveragePoolingLayer {filterShape, stride=1, pad=false, lowerPad=0, upperPad=0}
EmbeddingLayer {outDim, embeddingPath='', transpose=false}
RecurrentLSTMLayer {outputDim, cellShape=None, goBackwards=false, enableSelfStabilization=false}
DelayLayer {T=1, defaultHiddenActivation=0}
Dropout
BatchNormalizationLayer {spatialRank=0, initialScale=1, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true}
LayerNormalizationLayer {initialScale=1, initialBias=0}
StabilizerLayer{}
FeatureMVNLayer{}
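The layer factories above are constructed with curly braces and then applied to data like a function. A minimal sketch of composing them (the dimensions, the ReLU activation choice, and the `features` input are illustrative assumptions, not part of the reference above):

```
# sketch: a two-layer classifier built from the layer factories above
# (dimensions and the 'features' input are assumed for illustration)
model (features) = {
    h = DenseLayer {512, activation=ReLU} (features)   # hidden layer
    z = DenseLayer {10} (h)                            # unnormalized log posteriors
}.z
```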
Layer creation
Activation functions
Elementwise operations, unary
Abs (x)
Ceil (x)
Cosine (x)
Clip (x, minValue, maxValue)
Exp (x)
Floor (x)
Log (x)
Negate (x), -x
BS.Boolean.Not (b), !b
Reciprocal (x)
Round (x)
Sin (x)
Sqrt (x)
Elementwise operations, binary
ElementTimes (x, y), x .* y
Minus (x, y), x - y
Plus (x, y), x + y
LogPlus (x, y)
Less (x, y)
Equal (x, y)
Greater (x, y)
GreaterEqual (x, y)
NotEqual (x, y)
LessEqual (x, y)
BS.Boolean.And (a, b)
BS.Boolean.Or (a, b)
BS.Boolean.Xor (a, b)
Elementwise operations, ternary
BS.Boolean.If (condition, thenVal, elseVal)
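These elementwise primitives compose directly; for instance, an elementwise maximum can be built from Greater and BS.Boolean.If. A sketch (`x` and `y` are assumed tensors of matching shape; `ElementMax` is a hypothetical helper name, not a built-in):

```
# sketch: elementwise maximum built from Greater and BS.Boolean.If
# ('x' and 'y' are assumed tensors of matching shape)
ElementMax (x, y) = BS.Boolean.If (Greater (x, y), x, y)
```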
Matrix products and convolution operations
Times (A, B, outputRank=1), A * B
TransposeTimes (A, B, outputRank=1)
Convolution (weights, x, kernelShape, mapDims=(0), stride=(1), sharing=(true), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW', maxTempMemSizeInSamples=0)
Pooling (x, poolKind/*'max'|'average'*/, kernelShape, stride=(1), autoPadding=(true), lowerPadding=(0), upperPadding=(0), imageLayout='CHW')
ROIPooling (x, rois, roiOutputShape, spatialScale=1.0/16.0)
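Times is the building block of a fully-connected layer: combined with ParameterTensor (next section) the affine transformation can be written out by hand. A sketch, where outDim, inDim, and the input `x` are assumptions for illustration:

```
# sketch: an affine transformation written out with Times and Plus
# (outDim, inDim and the input 'x' are assumed)
W = ParameterTensor {(outDim:inDim), init='uniform'}
b = ParameterTensor {outDim, init='fixedValue', initValue=0.0}
z = W * x + b          # same as Plus (Times (W, x), b)
```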
Learnable parameters and constants
ParameterTensor {shape, learningRateMultiplier=1.0, init='uniform'/*|'gaussian'*/, initValueScale=1.0, initValue=0.0, randomSeed=-1, initFromFilePath=''}
Constant {scalarValue, rows=1, cols=1}
BS.Constants.Zero, BS.Constants.One
BS.Constants.True, BS.Constants.False, BS.Constants.None
BS.Constants.OnesTensor (shape)
BS.Constants.ZeroSequenceLike (x)
Inputs
Input (shape, dynamicAxis='', sparse=false, tag='feature')
DynamicAxis{}
EnvironmentInput (propertyName)
Mean (x), InvStdDev (x)
Loss functions and metrics
CrossEntropyWithSoftmax (targetDistribution, nonNormalizedLogClassPosteriors)
CrossEntropy (targetDistribution, classPosteriors)
Logistic (label, probability)
WeightedLogistic (label, probability, instanceWeight)
ClassificationError (labels, nonNormalizedLogClassPosteriors)
MatrixL1Reg (matrix)
MatrixL2Reg (matrix)
SquareError (x, y)
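A typical training setup pairs one of these losses with an evaluation metric. A sketch (the `labels` input and the network output `z` are assumed):

```
# sketch: standard criterion pair for a classifier
# ('labels' and the network output 'z' are assumed)
ce   = CrossEntropyWithSoftmax (labels, z)   # training objective
errs = ClassificationError     (labels, z)   # evaluation metric
```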
Reductions
ReduceSum (z, axis=None)
ReduceLogSum (z, axis=None)
ReduceMean (z, axis=None)
ReduceMin (z, axis=None)
ReduceMax (z, axis=None)
CosDistance (x, y)
SumElements (z)
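ReduceLogSum computes the log of the sum of exponentials, which makes a numerically stable log-softmax a one-liner. A sketch (the choice of axis and the helper name `LogSoftmax` are assumptions):

```
# sketch: log-softmax over the first static axis via ReduceLogSum
LogSoftmax (z) = z - ReduceLogSum (z, axis=1)
```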
Training operations
BatchNormalization (input, scale, bias, runMean, runInvStdDev, spatial, normalizationTimeConstant=0, blendTimeConstant=0, epsilon=0.00001, useCntkEngine=true, imageLayout='CHW')
Dropout (x)
Stabilize (x, enabled=true)
StabilizeElements (x, inputDim=x.dim, enabled=true)
CosDistanceWithNegativeSamples (x, y, numShifts, numNegSamples)
Reshaping operations
CNTK2.Reshape (x, shape, beginAxis=0, endAxis=0)
ReshapeDimension (x, axis, shape) = CNTK2.Reshape (x, shape, beginAxis=axis, endAxis=axis + 1)
FlattenDimensions (x, axis, num) = CNTK2.Reshape (x, 0, beginAxis=axis, endAxis=axis + num)
SplitDimension (x, axis, N) = ReshapeDimension (x, axis, 0:N)
Slice (beginIndex, endIndex, input, axis=1)
BS.Sequences.First (x) = Slice (0, 1, x, axis=-1)
BS.Sequences.Last (x) = Slice (-1, 0, x, axis=-1)
Splice (inputs, axis=1)
TransposeDimensions (x, axis1, axis2)
Transpose (x) = TransposeDimensions (x, 1, 2)
BS.Sequences.BroadcastSequenceAs (type, data1)
BS.Sequences.Gather (where, x)
BS.Sequences.Scatter (where, y)
BS.Sequences.IsFirst (x)
BS.Sequences.IsLast (x)
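Slice and Splice are inverses of a sort: a tensor cut along an axis can be re-assembled along that same axis. A sketch (the input `x` and the split point `N` are assumed):

```
# sketch: split a vector along axis 1 and re-join the halves
# ('x' and the split point 'N' are assumed)
firstHalf  = Slice (0,     N, x, axis=1)
secondHalf = Slice (N, 2 * N, x, axis=1)
rejoined   = Splice (firstHalf : secondHalf, axis=1)   # same shape as x
```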
Recurrence
OptimizedRNNStack
(weights, input, hiddenDims, numLayers=1, bidirectional=false, recurrentOp='lstm')
BS.Loop.Previous (x, timeStep=1, defaultHiddenActivation=0)
PastValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Previous (0, shape, ...)
BS.Loop.Next (x, timeStep=1, defaultHiddenActivation=0)
FutureValue (shape, x, defaultHiddenActivation=0.1, ...) = BS.Loop.Next (0, shape, ...)
LSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, aux=BS.Constants.None, auxDim=aux.shape, prevState, enableSelfStabilization=false)
BS.Boolean.Toggle (clk, initialValue=BS.Constants.False)
BS.RNNs.RecurrentLSTMP (outputDim, cellDim=outputDim, x, inputDim=x.shape, previousHook=BS.RNNs.PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputDim=0, layerIndex=0, enableSelfStabilization=false)
BS.RNNs.RecurrentLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.shape, previousHook=PreviousHC, augmentInputHook=NoAuxInputHook, augmentInputShape=0, enableSelfStabilization=false)
BS.RNNs.RecurrentBirectionalLSTMPStack (layerShapes, cellDims=layerShapes, input, inputShape=input.dim, previousHook=PreviousHC, nextHook=NextHC, enableSelfStabilization=false)
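BS.Loop.Previous is the primitive that recurrent layers are built on: a variable may be defined in terms of its own previous time step, and CNTK resolves the resulting loop. A hand-rolled simple (non-LSTM) recurrence as a sketch (hiddenDim, inputDim, and the input sequence `x` are assumed):

```
# sketch: a plain recurrence built directly on BS.Loop.Previous
# (hiddenDim, inputDim, and the input sequence 'x' are assumed)
W = ParameterTensor {(hiddenDim:inputDim),  init='uniform'}
R = ParameterTensor {(hiddenDim:hiddenDim), init='uniform'}
h = Tanh (W * x + R * BS.Loop.Previous (h))   # h(t) depends on h(t-1)
```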
Sequence-to-sequence support
BS.Seq2Seq.CreateAugmentWithFixedWindowAttentionHook (attentionDim, attentionSpan, decoderDynamicAxis, encoderOutput, enableSelfStabilization=false)
BS.Seq2Seq.GreedySequenceDecoderFrom (modelAsTrained)
BS.Seq2Seq.BeamSearchSequenceDecoderFrom (modelAsTrained, beamDepth)
Special-purpose operations
ClassBasedCrossEntropyWithSoftmax (labelClassDescriptorVectorSequence, mainInputInfo, mainWeight, classLogProbsBeforeSoftmax)
Model editing
BS.Network.Load (pathName)
BS.Network.Edit (inputModel, editFunctions, additionalRoots)
BS.Network.CloneFunction (inputNodes, outputNodes, parameters="learnable" /*|"constant"|"shared"*/)
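CloneFunction is typically used to lift a trained sub-graph into a new model, optionally freezing its weights. A sketch (the model path and the node names `features` and `h2` are hypothetical):

```
# sketch: reuse a trained feature extractor with frozen weights
# (the path and the node names 'features' and 'h2' are hypothetical)
net = BS.Network.Load ("path/to/trained.dnn")
featExtractor = BS.Network.CloneFunction (net.features, net.h2, parameters="constant")
h = featExtractor (myNewInput)
```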
Other
Fail (what)
IsSameObject (a, b)
Trace (node, say='', logFrequency=traceFrequency, logFirst=10, logGradientToo=false, onlyUpToRow=100000000, onlyUpToT=100000000, format=[])
Deprecated
ErrorPrediction (labels, nonNormalizedLogClassPosteriors)
ColumnElementTimes (...) = ElementTimes (...)
DiagTimes (...) = ElementTimes (...)
LearnableParameter(...) = Parameter(...)
LookupTable (embeddingMatrix, inputTensor)
RowRepeat (input, numRepeats)
RowSlice (beginIndex, numRows, input) = Slice(beginIndex, beginIndex + numRows, input, axis = 1)
RowStack (inputs)
RowElementTimes (...) = ElementTimes (...)
Scale (...) = ElementTimes (...)
ConstantTensor (scalarVal, shape)
Parameter (outputDim, inputDim, ...) = ParameterTensor ((outputDim:input), ...)
WeightParam (outputDim, inputDim) = Parameter (outputDim, inputDim, init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
DiagWeightParam (outputDim) = ParameterTensor ((outputDim), init='uniform', initValueScale=1, initOnCPUOnly=true, randomSeed=1)
BiasParam (dim) = ParameterTensor ((dim), init='fixedValue', value=0.0)
ScalarParam() = BiasParam (1)
SparseInput (shape, dynamicAxis='', tag='feature')
ImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
SparseImageInput (imageWidth, imageHeight, imageChannels, imageLayout='CHW', dynamicAxis='', tag='feature')
MeanVarNorm(feat) = PerDimMeanVarNormalization(feat, Mean (feat), InvStdDev (feat))
PerDimMeanVarNormalization (x, mean, invStdDev), PerDimMeanVarDeNormalization (x, mean, invStdDev)
ReconcileDynamicAxis (dataInput, layoutInput)