TensorFlow - Why are there so many similar or even duplicate functions in tf.nn and tf.layers / tf.losses / tf.contrib.layers etc.?
In TensorFlow (as of v1.2.1), there seem to be (at least) two parallel APIs for constructing computational graphs. There are functions in tf.nn, such as conv2d, avg_pool, relu, and dropout, and then there are similar functions in tf.layers, tf.losses, and elsewhere, such as tf.layers.conv2d, tf.layers.dense, and tf.layers.dropout.

Superficially, this situation seems to serve only to confuse: for example, tf.nn.dropout uses a 'keep rate' while tf.layers.dropout uses a 'drop rate' argument.

Does this distinction serve any practical purpose for the end-user/developer? If not, is there a plan to clean up the API?
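To make the naming difference concrete, here is a minimal sketch of the two dropout calls (assuming the TF 1.x API, which is reachable as tf.compat.v1 in TF 2; the specific probabilities are illustrative only):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

x = tf.ones([1000])

# tf.nn.dropout keeps each unit with probability `keep_prob` (here 0.8).
low = tf.nn.dropout(x, keep_prob=0.8)

# tf.layers.dropout drops each unit with probability `rate` (here 0.2),
# and is a no-op unless training=True -- another behavioral difference.
high = tf.layers.dropout(x, rate=0.2, training=True)

with tf.Session() as sess:
    kept_low = np.count_nonzero(sess.run(low))   # ~800 of 1000 survive
    kept_high = np.count_nonzero(sess.run(high))  # ~800 of 1000 survive
```

So keep_prob=0.8 and rate=0.2 express the same thing; mixing the two conventions up silently inverts the amount of dropout applied.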
TensorFlow offers, on the one hand, a low-level API (tf.*, tf.nn.*, ...) and, on the other hand, a higher-level API (tf.layers.*, tf.losses.*, ...).
The goal of the higher-level API is to provide functions that greatly simplify the design of the most common neural nets. The lower-level API is there for people with special needs, or who wish to keep finer control over what is going on.

The situation is a bit confusing, though, because some functions have the same or similar names, and also because there is no clear way to distinguish at first sight which namespace corresponds to which level of the API.
Now, let's look at conv2d as an example. A striking difference between tf.nn.conv2d and tf.layers.conv2d is that the latter takes care of all the variables needed for the weights and biases: a single line of code, and voilà, you have created a convolutional layer. With tf.nn.conv2d, you have to declare the weights variable yourself before passing it to the function. And as for the biases, well, they are not even handled: you need to add them yourself later.

Add to that the fact that tf.layers.conv2d also proposes to add regularization and activation in the same function call, and you can imagine how much this can reduce code size when one's needs are covered by the higher-level API.
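The contrast can be sketched side by side (again assuming the TF 1.x API via tf.compat.v1; the layer sizes and initializers are arbitrary choices for illustration):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

x = tf.placeholder(tf.float32, [None, 28, 28, 1])

# Low-level tf.nn.conv2d: declare the weight and bias variables yourself,
# then wire up the bias addition and activation by hand.
w = tf.get_variable("w", shape=[3, 3, 1, 32],
                    initializer=tf.glorot_uniform_initializer())
b = tf.get_variable("b", shape=[32], initializer=tf.zeros_initializer())
low = tf.nn.relu(tf.nn.conv2d(x, w, strides=[1, 1, 1, 1],
                              padding="SAME") + b)

# High-level tf.layers.conv2d: one call creates the weight and bias
# variables and applies the activation.
high = tf.layers.conv2d(x, filters=32, kernel_size=3,
                        padding="same", activation=tf.nn.relu)
```

Both tensors have the same shape, (None, 28, 28, 32); the difference is purely in how much bookkeeping the caller has to do.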
The higher-level API also makes, by default, decisions that could be considered best practices. For example, losses in tf.losses are added to the tf.GraphKeys.LOSSES collection by default, which makes the recovery and summation of the various components easy and standardized. If you use the lower-level API, you need to do all of that yourself. Obviously, you need to be careful when you start mixing low- and high-level API functions there.
The higher-level API also answers a great need of people who have otherwise been used to high-level functions in other frameworks, Theano aside. This is rather obvious when one considers the number of alternative higher-level APIs built on top of TensorFlow, such as Keras 2 (now developed exclusively for TensorFlow), slim (in tf.contrib.slim), TFLearn, TensorLayer, and the like.