python - Keras Training warm_start
Is it possible to continue training a Keras estimator with all hyperparameters (including a decreasing learning rate) and the weights saved from the previous epochs, as one can do in scikit-learn with the warm_start parameter? Something like this:
estimator = KerasRegressor(build_fn=create_model, epochs=20, batch_size=40, warm_start=True)
Specifically, warm start should do this:

warm_start : bool, optional, default False. When set to True, reuse the solution of the previous call to fit as initialization, otherwise, just erase the previous solution.
Is there anything like this in Keras?
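For reference, here is a minimal sketch of the scikit-learn behavior I mean, using MLPRegressor (one of the estimators that supports warm_start; the data is made up for illustration):

import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.random.rand(100, 5)
y = np.random.rand(100)

# With warm_start=True, every call to fit() resumes from the previous
# solution instead of re-initializing the weights.
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=20, warm_start=True)
model.fit(X, y)  # first 20 iterations
model.fit(X, y)  # another 20 iterations, starting from the saved weights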
Yes - it's possible, but rather cumbersome. You need to use the train_on_batch function, which keeps all model parameters (including the optimizer's state).

This is cumbersome because you need to divide the dataset into batches on your own, and you lose the possibility to apply Callbacks and to use the automatic progbar. I hope that in a new Keras version this option will be added to the fit method.
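A minimal sketch of the manual loop described above, assuming tf.keras (the stand-alone keras package works the same way); create_model and the data arrays are placeholders:

import numpy as np
from tensorflow import keras

def create_model():
    model = keras.Sequential([
        keras.Input(shape=(10,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

X = np.random.rand(500, 10)
y = np.random.rand(500)

model = create_model()
batch_size = 40

# Manual training loop: train_on_batch updates the weights in place and
# keeps the optimizer state, so each call effectively "warm starts" from
# wherever the previous batch left off.
for epoch in range(20):
    idx = np.random.permutation(len(X))  # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        loss = model.train_on_batch(X[batch], y[batch])
    print(f"epoch {epoch}: loss={loss:.4f}")

# Running this loop again later (or issuing more train_on_batch calls)
# continues from the current weights and optimizer state.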