Udacity Deep Learning Nanodegree



I completed Udacity’s Deep Learning Foundation Nanodegree Program. In this post I will share and discuss the projects I completed. I will also share resources I found useful for certain topics within deep learning (specifically the topics covered in the nanodegree). The course was broken into four sections: Neural Networks, Convolutional Neural Networks, Recurrent Neural Networks, and Generative Adversarial Networks.

All code will be uploaded to GitHub soon!

Preliminary Information

Neural Networks

  • Overview: In this section we had an intro to the course and got introduced to platforms such as Anaconda, tools like Jupyter Notebooks, and libraries like Pandas, Scikit-learn, and Matplotlib. We also looked at real applications of deep learning by exploring some already-working code for Style Transfer, DeepTraffic, and Flappy Bird. In this section we were also recommended books (Grokking Deep Learning, Neural Networks and Deep Learning, Deep Learning - links below), and we reviewed basic concepts like regression, matrix math, and NumPy. After all of the intro we finally explored neural network basics, which included studying the perceptron, gradient descent, multilayer perceptrons (fully connected networks), and backpropagation.
  • Resources:
  • Project 1 'Your First Neural Network'
    • This description is copied directly from Udacity: 'In this project, you'll get to build a neural network from scratch to carry out a prediction problem on a real dataset! By building a neural network from the ground up, you'll have a much better understanding of gradient descent, backpropagation, and other concepts that are important to know before we move to higher level tools such as Tensorflow. You'll also get to see how to apply these networks to solve real prediction problems!'
    • link to code coming soon
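To give a flavor of what "from scratch" means here, below is a minimal NumPy sketch of gradient descent on a single sigmoid neuron. It is not Udacity's project code; the toy OR dataset, seed, and hyperparameters are all illustrative choices of mine.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny toy dataset: learn OR from two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(42)
weights = rng.normal(scale=0.1, size=2)
bias = 0.0
learning_rate = 0.5

for epoch in range(5000):
    # Forward pass
    output = sigmoid(X @ weights + bias)
    error = output - y
    # Backward pass: chain rule through the sigmoid (derivative s * (1 - s))
    grad = error * output * (1 - output)
    weights -= learning_rate * (X.T @ grad) / len(X)
    bias -= learning_rate * grad.mean()

predictions = (sigmoid(X @ weights + bias) > 0.5).astype(int)
print(predictions)  # learns OR: [0 1 1 1]
```

The forward pass, error, and weight update above are the same three steps the from-scratch project scales up to a full hidden layer.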

Convolutional Neural Networks

  • Overview: In this section we got introduced to concepts like model evaluation and validation. Then we took a slight tangent and discussed Sentiment Analysis with Andrew Trask. We were then introduced to TFLearn, Cloud Computing (using AWS), Keras, and TensorFlow. After covering the basics we dove into Deep Neural Networks and learned the basics of CNNs and their components. We looked into parameter sharing, filters (padding, stride, width, height, depth), and the basics of convolution. We also explored concepts like max pooling and ReLU activations. Finally, we explored image classification.
  • Resources:
  • Project 2 'Image Classification'
    • This description is copied directly from Udacity: 'In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. The dataset will need to be preprocessed, then train a convolutional neural network on all the samples. You'll normalize the images, one-hot encode the labels, build a convolutional layer, max pool layer, and fully connected layer. At the end, you'll see their predictions on the sample images.'
    • link to code coming soon
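To make the filter and pooling ideas from this section concrete, here is a small NumPy sketch of the two core CNN operations: a 2D convolution with stride 1 and no padding ('valid'), followed by ReLU and 2x2 max pooling. The 4x4 image and the kernel values are made up purely for demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image (stride 1, no padding) and sum products."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = feature_map.shape
    trimmed = feature_map[:h - h % size, :w - w % size]
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[0.0, -1.0], [1.0, 0.0]])  # toy diagonal-difference filter

features = np.maximum(conv2d(image, kernel), 0)  # ReLU keeps positive responses
pooled = max_pool(features)
print(features.shape, pooled.shape)  # (3, 3) (1, 1)
```

Note how the shapes shrink: a 2x2 kernel over a 4x4 image yields a 3x3 feature map, and 2x2 pooling then halves each spatial dimension (dropping the odd leftover row and column), which is the downsampling behavior discussed in the lessons.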
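The project description mentions two preprocessing steps, normalizing images and one-hot encoding labels. A hedged sketch of both is below; the 'CIFAR-10-shaped' batch of random pixels is fake stand-in data, not the actual dataset.

```python
import numpy as np

num_classes = 10  # CIFAR-10 has ten classes

# Fake stand-in batch: four 32x32 RGB images with 0-255 pixel values.
rng = np.random.default_rng(0)
fake_images = rng.integers(0, 256, size=(4, 32, 32, 3))
fake_labels = np.array([3, 0, 9, 3])

def normalize(images):
    """Scale 0-255 pixel values into the [0, 1] range."""
    return images.astype(np.float32) / 255.0

def one_hot(labels, n):
    """Turn integer class labels into one-hot row vectors."""
    encoded = np.zeros((len(labels), n), dtype=np.float32)
    encoded[np.arange(len(labels)), labels] = 1.0
    return encoded

x = normalize(fake_images)
y = one_hot(fake_labels, num_classes)
print(x.dtype, y.shape)  # float32 (4, 10)
```

Normalizing keeps the inputs in a small, consistent range so gradient descent behaves well, and one-hot labels match the shape of the network's ten-way softmax output.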

Recurrent Neural Networks

Generative Adversarial Networks