Change the Learning Rate of the Adam Optimizer on a Keras Network

Chris Achard
Published 6 years ago
Updated 5 years ago

We can specify several options on a network optimizer, like the learning rate and decay, so we'll investigate what effect those have on training time and accuracy. Each dataset may respond differently, so it's important to try different optimizer settings to find one that properly trades off training time vs. accuracy for your data.

Instructor: [00:00] We're using the Adam optimizer for the network, which has a default learning rate of 0.001. To change that, first import Adam from keras.optimizers. Then, instead of just passing the string 'adam', we can create a new instance of the Adam optimizer and use that to set the optimizer.
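A minimal sketch of that change, assuming a small Sequential model as a stand-in for the network from the lesson (the layer sizes here are placeholders; depending on your Keras version, the import may be `keras.optimizers` or `tensorflow.keras.optimizers`):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.optimizers import Adam

# Placeholder model standing in for the network from the lesson.
model = Sequential([
    Input(shape=(4,)),
    Dense(8, activation='relu'),
    Dense(1, activation='sigmoid'),
])

# Instead of optimizer='adam', pass an Adam instance so we can set its options.
# With no arguments, Adam() uses the default learning rate of 0.001.
model.compile(optimizer=Adam(),
              loss='binary_crossentropy',
              metrics=['accuracy'])
```

Passing the instance rather than the string is what unlocks the optional parameters discussed next.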

[00:27] One of the optional parameters is lr, which stands for learning rate. Setting the learning rate is like saying how large the steps are that the network takes while learning. If the learning rate is too small, the network will never have a chance to get where it's going. The accuracy will always be low, or training will take a really long time.

[00:57] If the learning rate is too large, then the network will jump all over the place and will never be able to find the best solution, because it will keep jumping over it. Let's set a learning rate of 0.005 and see how that works for our problem set.
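Setting the custom rate is a one-line change, sketched here with a placeholder model. The lesson uses the older `lr` keyword; newer Keras versions accept `learning_rate` instead (and warn on or reject `lr`), so this sketch uses the newer name:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.optimizers import Adam

# Placeholder model standing in for the network from the lesson.
model = Sequential([
    Input(shape=(4,)),
    Dense(8, activation='relu'),
    Dense(1, activation='sigmoid'),
])

# lr=0.005 in older Keras; learning_rate=0.005 in newer versions.
model.compile(optimizer=Adam(learning_rate=0.005),
              loss='binary_crossentropy',
              metrics=['accuracy'])
```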

[01:15] After 100 epochs, we're seeing good progress on the training and validation accuracy, so we'll keep that learning rate. It's important to test different learning rates for your network, because each dataset and network topology will respond slightly differently to different learning rates.
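One way to test different learning rates is a simple sweep. This is a hedged sketch, not part of the lesson: the model, the random data, and the candidate rates are all placeholders you would replace with your own.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.optimizers import Adam

# Placeholder data; substitute your real training set.
x_train = np.random.rand(64, 4)
y_train = np.random.randint(0, 2, size=(64, 1))

def build_model():
    # Placeholder model standing in for the network from the lesson.
    return Sequential([
        Input(shape=(4,)),
        Dense(8, activation='relu'),
        Dense(1, activation='sigmoid'),
    ])

# Train a fresh model at each candidate rate and compare final accuracy.
for lr in [0.0001, 0.001, 0.005, 0.01]:
    model = build_model()
    model.compile(optimizer=Adam(learning_rate=lr),
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
    history = model.fit(x_train, y_train, epochs=5, verbose=0)
    print(f"lr={lr}: final accuracy {history.history['accuracy'][-1]:.3f}")
```

Building a fresh model inside the loop matters: reusing one model would carry weights from the previous rate into the next run and bias the comparison.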
