
Damir Zunic


Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

This is Course 2 of 5 in the Deep Learning Specialization, taught by Andrew Ng (DeepLearning.AI). With 37 videos (about 5.5 hours total), it opened the deep learning black box by helping me understand the processes that drive performance and generate good results.

The course taught me:

  • best practices for setting up training, development, and test sets
  • how to analyze bias/variance for building deep learning applications
  • how to use standard neural network techniques such as:
    • initialization
    • L2 and dropout regularization
    • hyperparameter tuning
    • batch normalization
    • gradient checking
  • how to implement and apply a variety of optimization algorithms and check for their convergence:
    • mini-batch gradient descent
    • Momentum
    • RMSprop
    • Adam
  • how to implement a neural network in TensorFlow
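To give a flavor of the regularization techniques above, here is a minimal NumPy sketch of inverted dropout (not code from the course assignments): units are zeroed at random during training, and the survivors are scaled up by `1 / keep_prob` so the expected activation is unchanged, which is why nothing special is needed at test time.

```python
import numpy as np

def dropout_forward(a, keep_prob, rng):
    """Inverted dropout: randomly zero units, then rescale so the
    expected value of each activation stays the same."""
    mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
    return a * mask / keep_prob, mask

rng = np.random.default_rng(0)
a = np.ones((4, 5))                      # toy activations
a_drop, mask = dropout_forward(a, keep_prob=0.8, rng=rng)
# Surviving units are scaled to 1 / 0.8 = 1.25; dropped units are 0.
```

The same cached `mask` is applied to the gradients during backpropagation, so dropped units contribute nothing to the update.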
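Gradient checking, also listed above, compares the backprop gradient against a two-sided finite-difference estimate. A small illustrative sketch (the cost `f` and the threshold are my own toy choices, not the course's grading code):

```python
import numpy as np

def numeric_grad(f, w, eps=1e-7):
    """Two-sided finite-difference approximation of df/dw."""
    g = np.zeros_like(w)
    for i in range(w.size):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        g[i] = (f(w_plus) - f(w_minus)) / (2 * eps)
    return g

f = lambda w: np.sum(w ** 2)     # toy cost function
analytic = lambda w: 2 * w       # the "backprop" gradient to verify
w = np.array([3.0, -1.5, 0.5])

num = numeric_grad(f, w)
ana = analytic(w)
rel_err = np.linalg.norm(num - ana) / (np.linalg.norm(num) + np.linalg.norm(ana))
# A tiny relative error means the analytic gradient matches the numeric one.
```

In practice the check is run once on a small network (it is far too slow for training), and a relative error well below about 1e-7 is taken as evidence that backprop is implemented correctly.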
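Finally, the Adam optimizer from the list combines Momentum (a moving average of gradients) with RMSprop (a moving average of squared gradients). A minimal NumPy sketch of one update step, applied to a toy quadratic of my own choosing rather than a real network:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with bias-corrected moment estimates."""
    m = beta1 * m + (1 - beta1) * grad            # Momentum term
    v = beta2 * v + (1 - beta2) * grad ** 2       # RMSprop term
    m_hat = m / (1 - beta1 ** t)                  # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 201):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
```

The bias-correction terms matter early in training, when the moving averages are still warming up from zero; without them the effective step size would be badly underestimated.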

Credentials:

  • Certificate


Copyright © 2020–2025 - damirzunic.com - Web Design By Damir Zunic