# Deep-Learning-Specialization-Coursera

**Repository Path**: Rroscha/Deep-Learning-Specialization-Coursera

## Basic Information

- **Project Name**: Deep-Learning-Specialization-Coursera
- **Description**: Deep Learning Specialization courses by Andrew Ng, deeplearning.ai
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2022-01-08
- **Last Updated**: 2022-01-08

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Coursera Deep Learning Specialization

This is the repository for my implementations of the Deep Learning Specialization from Coursera, taught by [Andrew Ng](http://www.andrewng.org/).

### [Syllabus](https://www.coursera.org/specializations/deep-learning)

## Course 1. [Neural Networks and Deep Learning](https://www.coursera.org/learn/neural-networks-deep-learning)

Foundations of Deep Learning:
* Understand the major technology trends driving Deep Learning
* Be able to build, train, and apply fully connected deep neural networks
* Know how to implement efficient (vectorized) neural networks
* Understand the key parameters in a neural network's architecture

Codes:
* Week2: [Neural Network Basics](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Neural%20Networks%20and%20Deep%20Learning/week2)
* Week3: [Shallow Neural Network Implementation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Neural%20Networks%20and%20Deep%20Learning/week3)
* Week4: [Deep Neural Network Implementation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Neural%20Networks%20and%20Deep%20Learning/week4)
* Mathematical demonstration: [Backpropagation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/blob/master/Neural%20Networks%20and%20Deep%20Learning/backprop.PDF)
* Mathematical demonstration: [Cross-entropy & Softmax gradients](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/blob/master/Neural%20Networks%20and%20Deep%20Learning/cross_entropy_softmax.PDF)

## Course 2. [Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization](https://www.coursera.org/learn/deep-neural-network)

* Understand industry best practices for building deep learning applications.
* Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
* Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence.
* Understand new best practices for the deep learning era: how to set up train/dev/test sets and analyze bias/variance.
* Be able to implement a neural network in TensorFlow.

Codes:
* Week1: [Initialization, Regularization & Gradient Check](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Improving%20Deep%20Neural%20Networks/week1)
* Week2: [Optimization Algorithms](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Improving%20Deep%20Neural%20Networks/week2)
* Week3: [Hyperparameter tuning, Batch Normalization & Tensorflow Implementation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Improving%20Deep%20Neural%20Networks/week3)
* Mathematical demonstration: [Batch Normalization Gradient](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/blob/master/Improving%20Deep%20Neural%20Networks/batch_norm_backprop.PDF)
* Paper: [Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift](https://arxiv.org/pdf/1502.03167.pdf)

## Course 3. [Structuring Machine Learning Projects](https://www.coursera.org/learn/machine-learning-projects)

- Understand how to diagnose errors in a machine learning system
- Be able to prioritize the most promising directions for reducing error
- Understand complex ML settings, such as mismatched training/test sets, and comparing to and/or surpassing human-level performance
- Know how to apply end-to-end learning, transfer learning, and multi-task learning

## Course 4. [Convolutional Neural Networks](https://www.coursera.org/learn/convolutional-neural-networks)

* Understand how to build a convolutional neural network, including recent variations such as residual networks.
* Know how to apply convolutional networks to visual detection and recognition tasks.
* Know how to use neural style transfer to generate art.
* Be able to apply these algorithms to a variety of image, video, and other 2D or 3D data.

Codes:
* Week1: [Convolutional Neural Network Implementation in Numpy](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Convolutional%20Neural%20Networks/week1)
* Week2:
  * [Keras CNN Implementation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Convolutional%20Neural%20Networks/week2/Keras%20tutorial)
  * [ResNet Keras Implementation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Convolutional%20Neural%20Networks/week2/ResNet)
  * Paper: [Deep Residual Learning for Image Recognition](https://arxiv.org/abs/1512.03385)
* Week3:
  * [YOLO (You Only Look Once) Implementation](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Convolutional%20Neural%20Networks/week3)
  * Paper: [You Only Look Once: Unified, Real-Time Object Detection](https://arxiv.org/abs/1506.02640)
  * Paper: [YOLO9000: Better, Faster, Stronger](https://arxiv.org/abs/1612.08242)
* Week4:
  * [Neural Style
Transfer](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Convolutional%20Neural%20Networks/week4/Art%20Neural%20Transfer)
  * Paper: [A Neural Algorithm of Artistic Style](https://arxiv.org/abs/1508.06576)
  * [Face Recognition](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Convolutional%20Neural%20Networks/week4/Face%20Recognition)
  * Paper: [FaceNet: A Unified Embedding for Face Recognition and Clustering](https://arxiv.org/abs/1503.03832)
  * Paper: [Going deeper with convolutions (Inception Networks)](https://arxiv.org/abs/1409.4842)

## Course 5. [Sequence Models](https://www.coursera.org/learn/nlp-sequence-models)

* Understand how to build and train Recurrent Neural Networks (RNNs), and commonly used variants such as GRUs and LSTMs.
* Be able to apply sequence models to natural language problems, including text synthesis.
* Be able to apply sequence models to audio applications, including speech recognition and music synthesis.
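A single forward step of a vanilla RNN cell makes the recurrent architecture concrete. The sketch below is illustrative and not the assignment code; the weight names (`Wax`, `Waa`, `Wya`, `ba`, `by`) follow the course's notation, while the dimensions and random inputs are made up for the example.

```python
import numpy as np

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One time step of a vanilla RNN.

    xt:     input at time t,       shape (n_x, m)
    a_prev: previous hidden state, shape (n_a, m)
    Returns the next hidden state and the (pre-softmax) output.
    """
    # Hidden state: tanh of an affine combination of input and previous state
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)
    # Output: affine map of the hidden state (apply softmax for probabilities)
    yt = Wya @ a_next + by
    return a_next, yt

# Tiny example: 3 input features, 5 hidden units, 2 outputs, batch of 4
rng = np.random.default_rng(0)
n_x, n_a, n_y, m = 3, 5, 2, 4
a, y = rnn_cell_forward(
    rng.standard_normal((n_x, m)),    # xt
    np.zeros((n_a, m)),               # a_prev (initial state)
    rng.standard_normal((n_a, n_x)),  # Wax
    rng.standard_normal((n_a, n_a)),  # Waa
    rng.standard_normal((n_y, n_a)),  # Wya
    np.zeros((n_a, 1)),               # ba
    np.zeros((n_y, 1)),               # by
)
print(a.shape, y.shape)  # (5, 4) (2, 4)
```

An LSTM or GRU cell replaces the single `tanh` update with gated updates of the same shapes; unrolling this step over `Tx` time steps (and reusing the same weights at each step) gives the full forward pass.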
Codes:
* Week1:
  * [RNN & LSTM Implementation in Numpy (Including backpropagation)](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week1/Building%20a%20RNN)
  * Mathematical demonstration: [RNN gradient through time](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/blob/master/Sequence%20Models/rnn_through_time_backprop.PDF)
  * Mathematical demonstration: [LSTM gradient through time](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/blob/master/Sequence%20Models/lstm_through_time_backprop.PDF)
  * Mathematical demonstration: [GRU gradient through time](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/blob/master/Sequence%20Models/gru_through_time_backprop.pdf)
  * Paper: [Vanishing/Exploding gradient & Clipping](http://proceedings.mlr.press/v28/pascanu13.pdf)
  * [Character-Level Language Modeling](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week1/Character-level%20language%20model)
  * [Sequence Sampling Generation LSTM](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week1/LSTM%20Network)
* Week2:
  * [Natural Language Processing & Word Embeddings](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week2/Emojyfier)
  * [Operations on word vectors - Debiasing](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week2/Operation%20on%20word%20vectors)
* Week3:
  * [Neural Machine Translation with Attention](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week3/Neural%20machine%20translation%20with%20attention)
  * [Trigger word detection](https://github.com/AdalbertoCq/Deep-Learning-Specialization-Coursera/tree/master/Sequence%20Models/week3/Trigger%20word%20detection)
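The attention-based translation assignment scores each encoder state against the decoder state and takes a weighted average. The course assignment learns a small alignment network for the scores; as a rough illustration of the same idea, the sketch below uses a simplified dot-product score instead, with made-up names and shapes.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(s_prev, enc_states):
    """Attention weights and context vector for one decoder step.

    s_prev:     decoder hidden state, shape (n,)
    enc_states: encoder hidden states, shape (Tx, n)
    Returns (alphas, context): weights (Tx,) and context vector (n,).
    """
    scores = enc_states @ s_prev   # similarity of each encoder state
    alphas = softmax(scores)       # weights sum to 1 over the Tx steps
    context = alphas @ enc_states  # weighted average of encoder states
    return alphas, context

# Toy example: 4 encoder time steps, hidden size 3
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
alphas, ctx = dot_product_attention(np.array([1.0, 0.0, 0.0]), enc)
print(round(float(alphas.sum()), 6))  # 1.0
```

The context vector is what the decoder consumes at each output step; recomputing the weights per step is what lets the model attend to different parts of the input sentence as the translation progresses.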