The wrapped layer must:
- Be a sequence-processing layer (accept 3D+ inputs).
- Have go_backwards, return_sequences and return_state
attributes (with the same semantics as for the RNN class).
- Have an input_spec attribute.
- Implement serialization via get_config() and from_config().
Note that the recommended way to create new RNN layers is to write a
custom RNN cell and use it with keras.layers.RNN, instead of
subclassing keras.layers.Layer directly.
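As a sketch of that recommendation, the hypothetical MinimalRNNCell below is a custom cell wrapped in keras.layers.RNN and then in Bidirectional. The cell name, unit count, and shapes are illustrative assumptions, not part of this API; note that the cell must be serializable (here via register_keras_serializable and get_config) because Bidirectional rebuilds the forward and backward layers from their configs.

```python
import tensorflow as tf
from tensorflow.keras import layers


# Registration is needed so Bidirectional can clone the wrapped RNN layer,
# which serializes and deserializes the custom cell.
@tf.keras.utils.register_keras_serializable()
class MinimalRNNCell(layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.state_size = units

    def build(self, input_shape):
        # Input and recurrent weight matrices.
        self.kernel = self.add_weight(
            shape=(input_shape[-1], self.units), initializer="glorot_uniform"
        )
        self.recurrent_kernel = self.add_weight(
            shape=(self.units, self.units), initializer="orthogonal"
        )

    def call(self, inputs, states):
        h = tf.matmul(inputs, self.kernel) + tf.matmul(
            states[0], self.recurrent_kernel
        )
        return h, [h]

    def get_config(self):
        # Include units so from_config can rebuild the cell.
        return {**super().get_config(), "units": self.units}


# Wrap the cell in RNN, then in Bidirectional.
layer = layers.Bidirectional(layers.RNN(MinimalRNNCell(8)))
outputs = layer(tf.random.normal((2, 5, 10)))  # (2, 16): both directions concatenated
```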
When return_sequences is True, the output of the masked
timestep will be zero regardless of the layer's original
zero_output_for_mask value.
Args
merge_mode
Mode by which outputs of the forward and backward RNNs
will be combined. One of {"sum", "mul", "concat", "ave", None}.
If None, the outputs will not be combined;
they will be returned as a list. Defaults to "concat".
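The effect of merge_mode on the output shape can be sketched as follows (the batch size, sequence length, and unit count are illustrative assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal((2, 5, 10))  # (batch, timesteps, features)

# "concat" joins the two 8-unit outputs; "sum", "mul" and "ave"
# combine them elementwise, so the unit count is unchanged.
concat = layers.Bidirectional(layers.LSTM(8), merge_mode="concat")(x)  # (2, 16)
summed = layers.Bidirectional(layers.LSTM(8), merge_mode="sum")(x)     # (2, 8)

# merge_mode=None returns the forward and backward outputs as a list.
forward_out, backward_out = layers.Bidirectional(
    layers.LSTM(8), merge_mode=None
)(x)
```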
backward_layer
Optional keras.layers.RNN,
or keras.layers.Layer instance to be used to handle
backwards input processing.
If backward_layer is not provided, the layer instance passed
as the layer argument will be used to generate the backward layer
automatically.
Note that the provided backward_layer layer should have properties
matching those of the layer argument, in particular
it should have the same values for stateful, return_state,
return_sequences, etc. In addition, backward_layer
and layer should have different go_backwards argument values.
A ValueError will be raised if these requirements are not met.
Call arguments
The call arguments for this layer are the same as those of the
wrapped RNN layer. Beware that when passing the initial_state
argument during the call of this layer, the first half
of the elements in the initial_state list will be passed to
the forward RNN call and the last half will be passed to the
backward RNN call.
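A minimal sketch of that splitting, assuming an LSTM wrapped layer (sizes are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers

batch, timesteps, features, units = 2, 5, 10, 8
x = tf.random.normal((batch, timesteps, features))

# An LSTM carries two state tensors (h, c), so the bidirectional wrapper
# expects four: states[:2] go to the forward RNN, states[2:] to the
# backward RNN.
initial_state = [tf.zeros((batch, units)) for _ in range(4)]
output = layers.Bidirectional(layers.LSTM(units))(
    x, initial_state=initial_state
)  # (2, 16)
```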
Examples:
model = Sequential([
    Input(shape=(5, 10)),
    Bidirectional(LSTM(10, return_sequences=True)),
    Bidirectional(LSTM(10)),
    Dense(5, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="rmsprop")

# With custom backward layer
forward_layer = LSTM(10, return_sequences=True)
backward_layer = LSTM(10, activation="relu", return_sequences=True,
                      go_backwards=True)
model = Sequential([
    Input(shape=(5, 10)),
    Bidirectional(forward_layer, backward_layer=backward_layer),
    Dense(5, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="rmsprop")
Attributes
input
Retrieves the input tensor(s) of a symbolic operation.
Only returns the tensor(s) corresponding to the first time
the operation was called.
output
Retrieves the output tensor(s) of a layer.
Only returns the tensor(s) corresponding to the first time
the operation was called.
from_config
This method is the reverse of get_config,
capable of instantiating the same layer from the config
dictionary. It does not handle layer connectivity
(handled by Network), nor weights (handled by set_weights).
Args
config
A Python dictionary, typically the
output of get_config.
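The round trip can be sketched as follows (the unit count and merge_mode value are illustrative assumptions):

```python
from tensorflow.keras import layers

layer = layers.Bidirectional(
    layers.LSTM(4, return_sequences=True), merge_mode="sum"
)
config = layer.get_config()

# from_config rebuilds an equivalent, freshly initialized layer; weights
# and connectivity are not restored from the config.
clone = layers.Bidirectional.from_config(config)
```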
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-06-07 UTC."],[],[]]