How does TensorFlow support using optimizers with custom ops?
I've made a new op and I'd like to use it with AdamOptimizer. I created a gradient for it following the instructions here, and added it to the optimizer's var_list, but TensorFlow says my variable doesn't have a processor.
Is there any support in TensorFlow for using custom ops with optimizers? Does the Optimizer class let me create a new processor, or would I have to rewrite part of compute_gradients()?
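For context, here is the standard TF 1.x pattern I'm referring to, where minimize() is just compute_gradients() followed by apply_gradients() on a var_list of ordinary tf.Variable objects. The toy loss below is mine, just for illustration, not my actual code:

```python
import tensorflow as tf

# Toy example: minimize() is compute_gradients() followed by
# apply_gradients(), and both operate on tf.Variable objects in var_list.
w = tf.Variable(1.0, name="w")
loss = tf.square(w - 3.0)

opt = tf.train.AdamOptimizer(learning_rate=0.1)
grads_and_vars = opt.compute_gradients(loss, var_list=[w])
train_op = opt.apply_gradients(grads_and_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op)
    print(sess.run(w))  # converges towards 3.0
```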
Also, what does "automatic differentiation" mean here? As stated in the TF docs:
To make automatic differentiation work for new ops, you must register a gradient function which computes gradients with respect to the ops' inputs given gradients with respect to the ops' outputs.
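As far as I understand, the registration the docs describe looks roughly like this in TF 1.x. The op type name "MyCustomOp", the .so path, and the assumed forward computation y = x**2 are all placeholders, not my actual op:

```python
import tensorflow as tf

# Placeholder: in practice the op comes from a compiled kernel, e.g.
# my_module = tf.load_op_library('./my_custom_op.so')

@tf.RegisterGradient("MyCustomOp")          # op type name is a placeholder
def _my_custom_op_grad(op, grad):
    # `op` is the forward op; `grad` is dL/d(output).
    # Assuming the forward op computes y = x ** 2, so dL/dx = grad * 2 * x.
    x = op.inputs[0]
    return [grad * 2.0 * x]                 # one gradient per op input
```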
Thanks!
So I found out that what I was trying to do is not supported by the TensorFlow optimizers.
I was trying to create an op that acts like a TensorFlow variable (i.e. one that gets updated by the functions inside Optimizer.minimize()). However, I believe TF uses these weird processors and Eigen::Tensors, which I don't understand, in order to apply the gradient updates in minimize(), and naturally that doesn't work with op classes.
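If it helps anyone, the approach I'd expect to work instead (a sketch under my own assumptions, with tf.matmul standing in for the custom op's Python wrapper) is to keep the trainable state in an ordinary tf.Variable and feed it into the op as an input, so the optimizer only ever updates variables it has processors for:

```python
import tensorflow as tf

# Sketch, not my actual code: keep the trainable state in a tf.Variable
# and pass it into the op as an input. tf.matmul stands in for the
# custom op's Python wrapper.
x = tf.placeholder(tf.float32, [None, 4])
y = tf.placeholder(tf.float32, [None, 1])

w = tf.Variable(tf.random_normal([4, 1]), name="w")
pred = tf.matmul(x, w)                      # custom op would go here
loss = tf.reduce_mean(tf.square(pred - y))

# The optimizer now only ever sees `w`, a plain tf.Variable.
train_op = tf.train.AdamOptimizer(0.01).minimize(loss, var_list=[w])
```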