 BatchNormalizationLayer - Maple Help

DeepLearning

 BatchNormalizationLayer
 create batch normalization layer

 Calling Sequence

 BatchNormalizationLayer(dim, opts)

 Parameters

 dim - positive integer

 opts - one or more options as specified below

 Options

 • center : truefalse
 If true, the learned offset beta will be added to the normalized tensor. The default is true.
 • epsilon : numeric
 Specifies a small real number to be added to the variance to avoid division by zero. The default is 0.01.
 • inputshape : list of integers
 Shape of the input Tensor, not including the batch axis.
 With the default value auto, the shape is inferred. If inference is not possible, an error is issued.
 This option need only be specified when this layer is the first in a Sequential model.
 • momentum : numeric
 Specifies the momentum for the moving average. The default is 0.99.
 • scale : truefalse
 If true, the learned scale factor gamma will be multiplied by the normalized tensor. The default is true.

 Description

 • The BatchNormalizationLayer(dim, opts) command creates a batch normalization neural network layer.
 • This function is part of the DeepLearning package, so it can be used in the short form BatchNormalizationLayer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[BatchNormalizationLayer](..).

 Details

 • The implementation of BatchNormalizationLayer uses tf.keras.layers.BatchNormalization from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.keras.layers.BatchNormalization for more information.

 Examples

 > $\mathrm{with}\left(\mathrm{DeepLearning}\right)$
 $\left[{\mathrm{AddMultiple}}{,}{\mathrm{ApplyOperation}}{,}{\mathrm{BatchNormalizationLayer}}{,}{\mathrm{BidirectionalLayer}}{,}{\mathrm{BucketizedColumn}}{,}{\mathrm{CategoricalColumn}}{,}{\mathrm{Classify}}{,}{\mathrm{Concatenate}}{,}{\mathrm{Constant}}{,}{\mathrm{ConvolutionLayer}}{,}{\mathrm{DNNClassifier}}{,}{\mathrm{DNNLinearCombinedClassifier}}{,}{\mathrm{DNNLinearCombinedRegressor}}{,}{\mathrm{DNNRegressor}}{,}{\mathrm{Dataset}}{,}{\mathrm{DenseLayer}}{,}{\mathrm{DropoutLayer}}{,}{\mathrm{EinsteinSummation}}{,}{\mathrm{EmbeddingLayer}}{,}{\mathrm{Estimator}}{,}{\mathrm{FeatureColumn}}{,}{\mathrm{Fill}}{,}{\mathrm{FlattenLayer}}{,}{\mathrm{GRULayer}}{,}{\mathrm{GatedRecurrentUnitLayer}}{,}{\mathrm{GetDefaultGraph}}{,}{\mathrm{GetDefaultSession}}{,}{\mathrm{GetEagerExecution}}{,}{\mathrm{GetVariable}}{,}{\mathrm{GradientTape}}{,}{\mathrm{IdentityMatrix}}{,}{\mathrm{LSTMLayer}}{,}{\mathrm{Layer}}{,}{\mathrm{LinearClassifier}}{,}{\mathrm{LinearRegressor}}{,}{\mathrm{LongShortTermMemoryLayer}}{,}{\mathrm{MaxPoolingLayer}}{,}{\mathrm{Model}}{,}{\mathrm{NumericColumn}}{,}{\mathrm{OneHot}}{,}{\mathrm{Ones}}{,}{\mathrm{Operation}}{,}{\mathrm{Optimizer}}{,}{\mathrm{Placeholder}}{,}{\mathrm{RandomTensor}}{,}{\mathrm{ResetDefaultGraph}}{,}{\mathrm{Restore}}{,}{\mathrm{Save}}{,}{\mathrm{Sequential}}{,}{\mathrm{Session}}{,}{\mathrm{SetEagerExecution}}{,}{\mathrm{SetRandomSeed}}{,}{\mathrm{SoftMaxLayer}}{,}{\mathrm{SoftmaxLayer}}{,}{\mathrm{Tensor}}{,}{\mathrm{Variable}}{,}{\mathrm{Variables}}{,}{\mathrm{VariablesInitializer}}{,}{\mathrm{Zeros}}\right]$ (1)
 > $\mathrm{model}≔\mathrm{Sequential}\left(\left[\mathrm{DenseLayer}\left(3\right),\mathrm{BatchNormalizationLayer}\left(2\right)\right]\right)$
 ${\mathrm{model}}{≔}{\mathrm{DeepLearning Model}}$ (2)
 > $\mathrm{model}:-\mathrm{Compile}\left(\right)$

 Compatibility

 • The DeepLearning[BatchNormalizationLayer] command was introduced in Maple 2022.
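 The options listed above can be supplied directly in the calling sequence. The following is a minimal sketch extending the example in this page; the option values chosen here (epsilon, momentum, center, scale) are illustrative only, not recommended settings:

```maple
with(DeepLearning):
# Build a two-layer model; the batch normalization layer takes
# illustrative option values rather than the defaults.
model := Sequential([DenseLayer(3),
    BatchNormalizationLayer(2, epsilon = 0.001, momentum = 0.9,
                            center = true, scale = false)]):
model:-Compile()
```

 With scale = false, the learned factor gamma is omitted; per the Options section, center = true (the default) still adds the learned offset beta to the normalized tensor.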