Wednesday, February 26, 2025

I really liked the way you described the theory, the working principle, and the components of the LSTM-RNN cell. However, I was not able to follow the Python/TensorFlow implementation part. For example, in the build_model function you wrote, does simply adding the parameter return_sequences=True to the first layer take care of everything? In the theory you mentioned using several activation functions such as sigmoid and tanh, along with a set of gates (doing element-wise multiplication and addition), yet none of them were specified while building the model or while optimizing it 🤔
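For context, here is roughly what I understood your build_model to look like, just a sketch on my side with assumed layer sizes, output head, and loss, not your exact code. Am I right that the sigmoid and tanh you described are simply the LSTM layer's own defaults (recurrent_activation='sigmoid' for the gates, activation='tanh' for the cell/candidate update), so they never need to be spelled out when building the model?

```python
# Minimal sketch (my assumptions, not the post's actual code) of a build_model
# using tf.keras. The gate machinery from the theory section lives inside the
# LSTM layer itself: recurrent_activation='sigmoid' drives the input, forget,
# and output gates, while activation='tanh' is used for the candidate/cell state.
import tensorflow as tf

def build_model(input_shape, units=50):
    model = tf.keras.Sequential([
        # return_sequences=True makes this layer emit its hidden state at every
        # time step, so another recurrent layer can be stacked on top of it.
        tf.keras.layers.LSTM(
            units,
            return_sequences=True,
            activation='tanh',               # default: candidate/cell-state update
            recurrent_activation='sigmoid',  # default: the three gates
            input_shape=input_shape,
        ),
        tf.keras.layers.LSTM(units),  # final recurrent layer: only the last state
        tf.keras.layers.Dense(1),     # assumed single-value regression head
    ])
    model.compile(optimizer='adam', loss='mse')  # assumed optimizer/loss
    return model
```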

