How does TensorFlow support using optimizers with custom ops?


I've made a new op, and I'd like to use it with AdamOptimizer. I've created a gradient for the op following the instructions here, and added it to the optimizer's var_list, but TensorFlow says my variable doesn't have a processor.
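Roughly what this looks like (a minimal reconstruction assuming the TF 1.x API; tf.square stands in for my custom op here, since any plain op output behaves the same way when placed in var_list):

    import tensorflow as tf

    x = tf.constant([[1.0, 2.0]])
    out = tf.square(x)  # an output Tensor, not a tf.Variable

    loss = tf.reduce_sum(out)
    opt = tf.train.AdamOptimizer(learning_rate=0.01)
    # This fails: the optimizer cannot build an update "processor"
    # for a plain Tensor, only for tf.Variable objects.
    train = opt.minimize(loss, var_list=[out])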

Is there support in TensorFlow for using custom ops with optimizers? Does the Optimizer class let me create a new processor, or would I have to rewrite part of compute_gradients?

Also, what does automatic differentiation mean here? As stated in the TF docs:

"To make automatic differentiation work for new ops, you must register a gradient function which computes gradients with respect to the ops' inputs given gradients with respect to the ops' outputs."
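For context, the registration pattern the docs describe looks like this (the op type "MyOp" and its gradient below are hypothetical placeholders; the decorator pattern itself is the documented one, assuming the TF 1.x API):

    import tensorflow as tf
    from tensorflow.python.framework import ops

    # "MyOp" is a hypothetical custom op type with one input and one
    # output. The gradient function receives the op and `grad`
    # (dL/d(output)) and must return one gradient per input.
    @ops.RegisterGradient("MyOp")
    def _my_op_grad(op, grad):
        x = op.inputs[0]
        # E.g., if MyOp computed y = x**2, then dL/dx = grad * 2*x.
        return [grad * 2.0 * x]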

Thanks!

So I found out that what I was doing is not supported by the TensorFlow Optimizer.

I was trying to create an op that acts like a TensorFlow variable (i.e., one that gets updated by the functions within Optimizer::minimize()). However, TF uses processors and Eigen::Tensors internally, which I don't fully understand, to apply the gradient updates in minimize(), and naturally these don't work with op classes.
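What does work, as far as I can tell, is to keep the trainable state in a tf.Variable and feed it *into* the op, so the optimizer updates the Variable rather than the op itself. A minimal sketch, assuming the TF 1.x API, with tf.matmul standing in for a custom op that consumes the variable:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 4])
    targets = tf.placeholder(tf.float32, [None, 4])

    # The trainable state lives in a tf.Variable and flows into the
    # op; tf.matmul is a stand-in for the custom op.
    w = tf.Variable(tf.random_normal([4, 4]), name="w")
    y = tf.matmul(x, w)
    loss = tf.reduce_mean(tf.square(y - targets))

    opt = tf.train.AdamOptimizer(learning_rate=0.001)
    # w is a real Variable, so the optimizer has a processor for it.
    train_step = opt.minimize(loss, var_list=[w])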

