DeepLearning
GatedRecurrentUnitLayer
create GRU layer
Calling Sequence

GatedRecurrentUnitLayer(units, opts)
GRULayer(units, opts)

Parameters

units - positive integer
opts - (optional) one or more keyword options described below
Description

GatedRecurrentUnitLayer(units, opts) creates a gated recurrent unit (GRU) neural network layer with units units, where units is a positive integer giving the dimensionality of the layer's output space.

GRULayer(units, opts) is equivalent to GatedRecurrentUnitLayer(units, opts).
This function is part of the DeepLearning package, so it can be used in the short form GatedRecurrentUnitLayer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[GatedRecurrentUnitLayer](..).
Details

The implementation of GRULayer uses tf.keras.layers.GRU from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.keras.layers.GRU for more information.
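As a minimal sketch of the calling sequence above (the value 32 for units and the names gru and gru2 are illustrative only, and keyword options are omitted), a GRU layer can be created after loading the package:

with(DeepLearning):                    # load the DeepLearning package
gru := GatedRecurrentUnitLayer(32);    # GRU layer with 32 units
gru2 := GRULayer(32);                  # equivalent short-name form

Both calls construct the same kind of layer object; the optional keyword options (opts) can be appended after units when non-default behavior is needed.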
Compatibility

The DeepLearning[GatedRecurrentUnitLayer] command was introduced in Maple 2021.

For more information on Maple 2021 changes, see Updates in Maple 2021.
See Also
DeepLearning Overview