Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU) are two layer types commonly used to build recurrent neural networks in Keras. This video introduces these two layer types as a foundation for Natural Language Processing (NLP) and time series prediction.
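To give a feel for what a gated layer computes, here is a minimal NumPy sketch of a single GRU time step, following the standard formulation (update gate, reset gate, candidate state). The function name `gru_step` and the bias-free weight layout are illustrative assumptions for brevity; a real Keras `GRU` layer also includes bias terms and processes whole sequences.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step (illustrative; biases omitted for brevity)."""
    # Update gate: how much of the new candidate to mix in.
    z = sigmoid(x @ Wz + h_prev @ Uz)
    # Reset gate: how much of the previous state feeds the candidate.
    r = sigmoid(x @ Wr + h_prev @ Ur)
    # Candidate hidden state.
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh)
    # Interpolate between the previous state and the candidate.
    return (1.0 - z) * h_prev + z * h_tilde

# Tiny demo: random weights, zero initial state, 5-step sequence.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
shapes = [(n_in, n_hidden), (n_hidden, n_hidden)] * 3  # Wz,Uz,Wr,Ur,Wh,Uh
params = [rng.normal(scale=0.1, size=s) for s in shapes]
h = np.zeros(n_hidden)
for t in range(5):
    h = gru_step(rng.normal(size=n_in), h, *params)
print(h.shape)  # the hidden state keeps shape (n_hidden,)
```

An LSTM step is similar but maintains a separate cell state alongside the hidden state, with three gates (input, forget, output) instead of two.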
Code for This Video:
https://github.com/jeffheaton/t81_558...
Course Homepage: https://sites.wustl.edu/jeffheaton/t8...
Follow Me/Subscribe:
/ heatonresearch
https://github.com/jeffheaton
/ jeffheaton
Support Me on Patreon: / jeffheaton