I am following the examples in TensorFlow.jl and I get all the examples to work
nicely. However, when I want to change an activation function to tanh or sigmoid,
I get an error. I suspect I am doing something wrong rather than hitting a problem
with the package, so I ask here...
If I do this:
using TensorFlow
using TensorFlow.Train
using TensorFlow.InputData
import TensorFlow: DT_FLOAT32
import TensorFlow.API: relu, tanh, sigmoid, softmax_cross_entropy_with_logits,
AdamOptimizer, arg_max, equal, cast
methods(relu)
I get:
2 methods for generic function relu:
relu(features::Union{TensorFlow.CoreTypes.AbstractTensor,Void}) at
/Users/Raukhur/.julia/v0.4/TensorFlow/src/API/TfNn.jl:1006
relu(features::Union{TensorFlow.CoreTypes.AbstractTensor,Void},
name::Union{AbstractString,Void}) at
/Users/Raukhur/.julia/v0.4/TensorFlow/src/API/TfNn.jl:1006
But if I do this:
methods(tanh)
I get 13 methods for tanh, but none for TensorFlow-related types, and for
methods(sigmoid) I get:
LoadError: UndefVarError: sigmoid not defined
while loading In[42], in expression starting on line 6
So it seems relu gets imported but not tanh or sigmoid. Also, neither
methods(TensorFlow.API.tanh) nor
methods(TensorFlow.tanh) works.
It's probably something simple I have overlooked...
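For reference, here is a generic way to check which symbols a module actually defines, which is how I narrowed this down. This is just plain Julia introspection (names and isdefined are standard Base functions); whether TensorFlow.API actually contains a sigmoid binding is exactly the open question:

```julia
using TensorFlow

# List every name defined in the API submodule, including unexported ones
api_names = names(TensorFlow.API, true)

# Check whether the activation functions we want exist in that module
for f in (:relu, :tanh, :sigmoid)
    println(f, " defined in TensorFlow.API: ", isdefined(TensorFlow.API, f))
end
```

If sigmoid does not show up in that list at all, the wrapper presumably never generates a binding for it, which would make this a package issue rather than an import mistake on my side.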