python - Keras Training warm_start


Is it possible to continue training a Keras estimator with the hyperparameters (including a decreasing learning rate) and the weights saved from previous epochs, as one can with the warm_start parameter in scikit-learn? Something like this:

estimator = KerasRegressor(build_fn=create_model, epochs=20, batch_size=40, warm_start=True)

Specifically, warm start should do this:

warm_start : bool, optional, default: False — When set to True, reuse the solution of the previous call to fit as initialization, otherwise, just erase the previous solution.
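
For comparison, here is a minimal sketch of how warm_start behaves in scikit-learn (the choice of SGDRegressor and the random data are just illustrative, not part of the question):

import numpy as np
from sklearn.linear_model import SGDRegressor

X, y = np.random.rand(100, 5), np.random.rand(100)

reg = SGDRegressor(warm_start=True, max_iter=20, eta0=0.01)
reg.fit(X, y)               # first call: starts from a fresh solution
reg.set_params(eta0=0.001)  # e.g. lower the learning rate between calls
reg.fit(X, y)               # second call: continues from the previous coefficients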

Is there anything like that in Keras?

Yes - it's possible, but rather cumbersome. You need to use the train_on_batch function, which keeps the model parameters (and the optimizer's ones as well).

This is cumbersome because you need to divide the dataset into batches on your own, and you lose the ability to apply callbacks and to use the automatic progress bar. A sketch of such a loop follows below. I hope that in a new Keras version this option will be added to the fit method.
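
For illustration, a minimal sketch of that train_on_batch loop (assuming tensorflow.keras; the model, the random data, and the decaying learning-rate schedule are made up for the example):

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras import backend as K

X = np.random.rand(1000, 8)   # hypothetical training data
y = np.random.rand(1000, 1)

model = Sequential([Dense(32, activation="relu", input_shape=(8,)),
                    Dense(1)])
model.compile(optimizer="adam", loss="mse")

batch_size = 40
for epoch in range(20):
    # Decrease the learning rate between epochs; the weights and the
    # optimizer state persist across iterations, which gives the
    # "warm start" effect.
    K.set_value(model.optimizer.lr, 1e-3 * 0.9 ** epoch)
    indices = np.random.permutation(len(X))  # shuffle manually
    for start in range(0, len(X), batch_size):
        batch = indices[start:start + batch_size]
        loss = model.train_on_batch(X[batch], y[batch])
    print(f"epoch {epoch}: last batch loss = {loss:.4f}")

Note that everything fit would normally handle (shuffling, slicing into batches, progress reporting, callbacks) has to be done by hand here.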

