
DeepLearning

  

GatedRecurrentUnitLayer

  

create GRU layer

 


Calling Sequence

GatedRecurrentUnitLayer(units,opts)

GRULayer(units,opts)

Parameters

units - positive integer

opts - (optional) one or more keyword options described below

Description

• GatedRecurrentUnitLayer(units,opts) creates a gated recurrent unit (GRU) neural network layer with the number of units given by units (see the example below).

• GRULayer(units,opts) is equivalent to GatedRecurrentUnitLayer(units,opts).

• This function is part of the DeepLearning package, so it can be used in the short form GatedRecurrentUnitLayer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[GatedRecurrentUnitLayer](..).
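A minimal usage sketch follows. It relies only on the calls documented above; the value 16 is an arbitrary illustrative choice of units, and the printed form of the returned layer object may vary between Maple versions.

> with(DeepLearning):
> GatedRecurrentUnitLayer(16);                  # create a GRU layer with 16 units (short form)
> GRULayer(16);                                 # equivalent abbreviated form
> DeepLearning[GatedRecurrentUnitLayer](16);    # long form, available without with(DeepLearning)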

Details

• The implementation of GRULayer uses tf.keras.layers.GRU from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.keras.layers.GRU for more information.

Compatibility

• The DeepLearning[GatedRecurrentUnitLayer] command was introduced in Maple 2021.

• For more information on Maple 2021 changes, see Updates in Maple 2021.

See Also

DeepLearning Overview