[theano-users] Re: How to implement binary activation in theano?

2017-07-12 Thread zxzhijia
I see. I'll try that. Thanks.

On Wednesday, July 12, 2017 at 11:57:57 AM UTC-8, Jesse Livezey wrote:
> Do you need to take derivatives through the activation? If not, then you
> could use switch, i.e.
>
> x = some theano variable
> threshold = .5
> x_binary = T.switch(x > threshold, 1., 0.)

[theano-users] Re: About theano.function inside for loop

2017-07-12 Thread Jesse Livezey
Yes, you should be able to just call theano.function(...) before the loops.

On Wednesday, July 12, 2017 at 4:13:33 AM UTC-7, Kelvin Chiu wrote:
> for x in range(x_range):
>     for y in range(y_range):
>         t_test_set_x = theano_translation(test_set_x, x, y, borrow=True)
>         predict_m
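A minimal self-contained sketch of the restructuring suggested above, with a toy expression standing in for the poster's model (theano_translation, layer3, index, etc. would take its place):

import numpy as np
import theano
import theano.tensor as T

x = T.matrix('x')
s = T.scalar('s')
out = (x * s).sum()  # toy computation standing in for the real model

# Compile once, outside the loops: theano.function triggers graph
# optimization and C compilation, which is far too slow to repeat per iteration.
f = theano.function([x, s], out)

data = np.random.randn(4, 5).astype(theano.config.floatX)
for i in range(3):
    for j in range(3):
        result = f(data, i + j)  # only the cheap compiled call inside the loops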

[theano-users] Re: How to implement binary activation in theano?

2017-07-12 Thread Jesse Livezey
Do you need to take derivatives through the activation? If not, then you could use switch, i.e.

x = some theano variable
threshold = .5
x_binary = T.switch(x > threshold, 1., 0.)

On Wednesday, July 12, 2017 at 10:27:32 AM UTC-7, zxzh...@gmail.com wrote:
> In the binarized network github code ()
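The snippet above, expanded into a self-contained script (the example input is mine):

import numpy as np
import theano
import theano.tensor as T

x = T.vector('x')
threshold = 0.5
# Hard 0/1 output; the gradient w.r.t. x is zero almost everywhere,
# which is why this only suits cases that don't need derivatives.
x_binary = T.switch(x > threshold, 1., 0.)

f = theano.function([x], x_binary)
print(f(np.array([0.2, 0.7, 0.5], dtype=theano.config.floatX)))
# -> [0. 1. 0.]  (0.5 is not strictly greater than the threshold)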

[theano-users] How to implement binary activation in theano?

2017-07-12 Thread zxzhijia
In the binarized network github code (), Matthieu used stochastic binarization. I'm wondering how to define just a simple binary activation instead of a stochastic one in theano?
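A deterministic counterpart to the stochastic binarization in that code is the sign function, which maps pre-activations to -1/+1 as in the BinaryNet paper; a minimal sketch (the -1/+1 convention rather than 0/1 is an assumption based on that paper):

import numpy as np
import theano
import theano.tensor as T

x = T.vector('x')
# Deterministic binarization to {-1, +1} via the sign of the input;
# note T.sgn(0) returns 0, so exactly-zero inputs need their own convention.
x_bin = T.sgn(x)

f = theano.function([x], x_bin)
print(f(np.array([-1.3, 0.4, 2.0], dtype=theano.config.floatX)))
# -> [-1.  1.  1.]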

[theano-users] About theano.function inside for loop

2017-07-12 Thread Chiu Chun Pang
for x in range(x_range):
    for y in range(y_range):
        t_test_set_x = theano_translation(test_set_x, x, y, borrow=True)
        predict_model = theano.function(inputs=[index], outputs=layer3.errors(y),
                                        givens={

[theano-users] Re: Implementing a GPU op

2017-07-12 Thread Christopher Bourez
What surprises me is to get seg faults in the theano function, while I would have expected them to occur during evaluation on values...

On Wednesday, July 12, 2017 at 10:05:30 AM UTC+2, Christopher Bourez wrote:
> A second thing that is not clear to me in the documentation of Theano is
> how y
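One possible explanation (my reading, not confirmed in the thread): theano.function runs the graph optimizer, and rewrites such as constant folding can already execute an op's compiled C code at compilation time, so a bug in that code can crash inside theano.function before any values are passed in. A hedged way to narrow it down:

import theano
import theano.tensor as T

x = T.matrix('x')
y = T.nnet.softmax(x)  # placeholder for the graph containing the custom op

# Compile with optimizations off and the pure-Python linker; if the
# segfault disappears here, the C/GPU code pulled in at compile time is
# the likely culprit. (Equivalent flags: optimizer=None, linker=py.)
f = theano.function([x], y, mode=theano.Mode(optimizer='None', linker='py'))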

[theano-users] Re: Implementing a GPU op

2017-07-12 Thread Christopher Bourez
A second thing that is not clear to me in the documentation of Theano is how you specify a C implementation and a GPU implementation of the same custom op. Thank you

On Wednesday, July 12, 2017 at 9:58:34 AM UTC+2, Christopher Bourez wrote:
> I've also tried to create an example with theano.gpuarra
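As far as I understand it (hedged; the "Creating a new Op" docs are the reference), the CPU side is one Op whose Python fallback lives in perform() and whose C version lives in c_code(), while the GPU side is a separate Op class that a registered graph optimization swaps in. A runnable sketch of the CPU half, based on the docs' DoubleOp toy example:

import numpy as np
import theano
import theano.tensor as T
from theano import Op, Apply

class DoubleOp(Op):
    # Toy CPU op: Theano uses c_code() when a C compiler is available
    # and falls back to perform() otherwise.
    __props__ = ()

    def make_node(self, x):
        x = T.as_tensor_variable(x)
        return Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        output_storage[0][0] = inputs[0] * 2

    # c_code(self, node, name, inputs, outputs, sub) would go here; see
    # the "Extending Theano with a C Op" documentation for the full version.

x = T.vector('x')
f = theano.function([x], DoubleOp()(x))
print(f(np.array([1., 2.], dtype=theano.config.floatX)))  # -> [2. 4.]

The GPU version is then a second Op class built on the gpuarray base classes, registered with the lifting helpers in theano.gpuarray.opt (the register_opt/op_lifter pattern the built-in ops use) so the optimizer replaces the CPU node when the graph targets a GPU context.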

[theano-users] Re: Implementing a GPU op

2017-07-12 Thread Christopher Bourez
I've also tried to create an example with theano.gpuarray.nnet.GpuSoftmax, but after compilation it got replaced by another implementation, GpuDnnSoftmax:

Elemwise{mul,no_inplace} [id A] ''
 |HostFromGpu(gpuarray) [id B] ''
 | |GpuSoftmax [id C] ''
 |   |GpuFromHost [id D] ''
 |     |x
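If the goal is to keep GpuSoftmax in the optimized graph, excluding the cuDNN rewrites at compile time may help; a hedged sketch (using 'cudnn' as the optimizer tag is my assumption, matching the documented optimizer_excluding=cudnn flag):

import theano
import theano.tensor as T

x = T.matrix('x')
y = T.nnet.softmax(x)

# Ask the optimizer to skip cuDNN substitutions such as GpuDnnSoftmax;
# the same effect should be available via THEANO_FLAGS=optimizer_excluding=cudnn.
mode = theano.compile.get_default_mode().excluding('cudnn')
f = theano.function([x], y, mode=mode)
theano.printing.debugprint(f)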

[theano-users] Re: Implementing a GPU op

2017-07-12 Thread Christopher Bourez
I don't know what you mean by "not modifying" the source for GpuEye:
- In this example, I'm importing an unmodified GpuEye op from Theano's basic ops
- If I'm using theano.tensor.eye, then it does not use GpuEye

Also, are you sure this test https://github.com/Theano/Theano/blob/2625464534147f
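To check which op actually ends up in the compiled graph, debugprint on the compiled function is useful; a hedged sketch (the GpuEye import path and constructor are from memory of the gpuarray backend and may differ across versions):

import theano
import theano.tensor as T

# Let the optimizer lift tensor.eye onto the GPU (requires a GPU context,
# e.g. THEANO_FLAGS=device=cuda); a GpuEye node should then appear below.
n = T.iscalar('n')
f = theano.function([n], T.eye(n))
theano.printing.debugprint(f)

# Direct use, assumed import path and signature:
# from theano.gpuarray.basic_ops import GpuEye
# g = theano.function([n], GpuEye(dtype='float32')(n, n, 0))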