Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

Rating: 4.86

Course overview

Provider: Coursera
Course type: Free online course
Level: Intermediate
Deadline: Flexible
Duration: 26 hours
Language: English
Certificate: Paid certificate available
Course author: Andrew Ng

Description

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and learn to generate good results systematically. By the end, you will be able to:

  • apply best practices for setting up train, dev, and test sets and analyzing bias/variance when building deep learning applications;
  • use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking;
  • implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence;
  • implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step into the world of AI.
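As a brief illustration of several techniques named above (not taken from the course materials), the sketch below builds a small Keras classifier in TensorFlow using He initialization, L2 regularization, dropout, batch normalization, and the Adam optimizer; the layer sizes and hyperparameter values are arbitrary placeholders.

    import tensorflow as tf

    def build_model(input_dim, num_classes):
        """Small fully connected classifier illustrating the techniques above."""
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(input_dim,)),
            tf.keras.layers.Dense(
                64,
                kernel_initializer="he_normal",                     # weight initialization
                kernel_regularizer=tf.keras.regularizers.l2(1e-4),  # L2 regularization
            ),
            tf.keras.layers.BatchNormalization(),                   # batch normalization
            tf.keras.layers.ReLU(),
            tf.keras.layers.Dropout(0.5),                           # dropout regularization
            tf.keras.layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(
            optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # Adam optimizer
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"],
        )
        return model

    # Mini-batch gradient descent: the batch_size argument to fit() sets the
    # mini-batch size used during training, e.g.
    # model = build_model(input_dim=784, num_classes=10)
    # model.fit(x_train, y_train, epochs=10, batch_size=64, validation_split=0.1)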

Similar courses

Machine Learning
  • Flexible deadline
  • 61 hours
  • Certificate
Neural Networks and Deep Learning
  • Flexible deadline
  • 27 hours
  • Certificate
Introduction to Machine Learning in Production
  • Flexible deadline
  • 10 hours
  • Certificate