# Recurrent-Neural-Networks

**Repository Path**: davidgao7/Recurrent-Neural-Networks

## Basic Information

- **Project Name**: Recurrent-Neural-Networks
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2022-02-11
- **Last Updated**: 2022-02-11

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

# Recurrent Neural Networks

If GitHub is unable to render a Jupyter notebook, copy the notebook's link and paste it into nbviewer: https://nbviewer.jupyter.org/

These notebooks provide an introduction to **Recurrent Neural Networks (RNNs)**.

- Notebook 1: Motivation, the RNN architecture, signal propagation in RNNs, and RNN architectures
- Notebook 2: RNN training with backpropagation through time (BPTT)
- Notebook 3: RNN training issues: the vanishing and exploding gradient problems, and ways to prevent them (activation functions, layer normalization, gradient clipping)
- Notebook 4: Gated RNN cells (GRU, LSTM), an effective strategy for resolving the vanishing gradient problem in RNNs
- Notebook 5: Building effective RNNs, increasing the representational power of RNNs, and preventing overfitting in RNNs

The notebooks are partially adapted from:

- Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems (2nd Edition) by Aurélien Géron
- Dive into Deep Learning by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola
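As a taste of the signal propagation covered in Notebook 1, here is a minimal NumPy sketch (not taken from the notebooks themselves) of a vanilla RNN forward pass; the dimensions and the weight names `W_x`, `W_h`, and `b` are illustrative assumptions:

```python
import numpy as np

# Illustrative dimensions (assumptions, not from the notebooks)
n_inputs, n_hidden, n_steps = 3, 5, 4

rng = np.random.default_rng(0)
W_x = rng.normal(scale=0.1, size=(n_hidden, n_inputs))  # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # recurrent weights
b = np.zeros(n_hidden)                                  # bias

X = rng.normal(size=(n_steps, n_inputs))  # one input sequence
h = np.zeros(n_hidden)                    # initial hidden state

# The core recurrence: each hidden state depends on the current input
# and on the previous hidden state, so information flows through time.
states = []
for x_t in X:
    h = np.tanh(W_x @ x_t + W_h @ h + b)
    states.append(h)

states = np.stack(states)  # shape: (n_steps, n_hidden)
```

Because the same `W_h` is applied at every step, gradients during BPTT involve repeated products of this matrix, which is exactly what leads to the vanishing and exploding gradient issues discussed in Notebooks 2 and 3.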
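Notebook 3 mentions gradient clipping as a guard against exploding gradients. A minimal NumPy sketch of clipping by global norm (the function name `clip_by_global_norm` and the threshold value are illustrative assumptions, not the notebooks' code) looks like:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm
    does not exceed max_norm (a common fix for exploding gradients)."""
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if global_norm > max_norm:
        scale = max_norm / global_norm
        grads = [g * scale for g in grads]
    return grads

# Example: two oversized gradient arrays get rescaled so that
# their combined norm equals the threshold.
grads = [np.full((2, 2), 3.0), np.full((3,), 4.0)]
clipped = clip_by_global_norm(grads, max_norm=1.0)
```

Rescaling by the *global* norm, rather than clipping each array element-wise, preserves the direction of the overall gradient while bounding its magnitude.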