I see. I'll try that. Thanks.
On Wednesday, July 12, 2017 at 11:57:57 AM UTC-8, Jesse Livezey wrote:
>
> Do you need to take derivatives through the activation? If not, then you
> could use switch, i.e.
>
> x = some theano variable
> threshold = .5
> x_binary = T.switch(x > threshold, 1., 0.)
>
>
Yes, you should be able to just call theano.function(...) before the loops.
On Wednesday, July 12, 2017 at 4:13:33 AM UTC-7, Kelvin Chiu wrote:
>
> for x in range(x_range):
>     for y in range(y_range):
>         t_test_set_x = theano_translation(test_set_x, x, y, borrow=True)
>         predict_m
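The fix Jesse describes, sketched in plain Python: the Theano pieces from the thread (layer3, index, the givens dict) aren't fully shown, so a hypothetical `compile_predictor` stands in for the expensive `theano.function(...)` call; the point is only that the compilation step moves outside the nested loops.

```python
# Hoist the expensive one-time step out of the loops: build the
# callable once, then only call it inside the nested iteration.
def compile_predictor():
    # stands in for theano.function(...), which is costly to build
    # but cheap to call once compiled
    return lambda x, y: (x, y)

predict = compile_predictor()          # compile ONCE, before the loops

results = []
for x in range(3):
    for y in range(2):
        results.append(predict(x, y))  # cheap call inside the loops

print(len(results))
```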
Do you need to take derivatives through the activation? If not, then you
could use switch, i.e.
x = some theano variable
threshold = .5
x_binary = T.switch(x > threshold, 1., 0.)
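A minimal runnable sketch of what that `T.switch` expression computes, written with plain NumPy (`numpy.where` is the elementwise analogue of `T.switch`); the 0.5 threshold is just the example value from above:

```python
import numpy as np

# Elementwise hard threshold: the NumPy analogue of
# T.switch(x > threshold, 1., 0.)
def binarize(x, threshold=0.5):
    return np.where(x > threshold, 1.0, 0.0)

x = np.array([0.1, 0.5, 0.7, 0.49, 0.51])
print(binarize(x))  # strictly greater than 0.5 maps to 1.0
```

Note that, as Jesse says, the gradient of this activation is zero almost everywhere, which is why it only works if you don't need derivatives through it.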
On Wednesday, July 12, 2017 at 10:27:32 AM UTC-7, zxzh...@gmail.com wrote:
>
> In the binarized network github code ()
In the binarized network github code (), Matthieu used stochastic
binarization. I'm wondering how to define just a simple binary activation
instead of a stochastic one in Theano?
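For contrast, a small NumPy sketch of the two schemes being discussed (the function names and the [0, 1] activation range are assumptions for illustration, not taken from Matthieu's code): deterministic binarization thresholds at 0.5, while stochastic binarization treats each activation as a Bernoulli probability of outputting 1.

```python
import numpy as np

def binarize_deterministic(a, threshold=0.5):
    # simple binary activation: 1 where a exceeds the threshold
    return np.where(a > threshold, 1.0, 0.0)

def binarize_stochastic(a, rng):
    # treat each activation in [0, 1] as P(output = 1)
    return (rng.random(a.shape) < a).astype(np.float64)

a = np.array([0.1, 0.5, 0.9])
det = binarize_deterministic(a)
rng = np.random.default_rng(0)
sto = binarize_stochastic(a, rng)
```

Both produce values in {0, 1}; only the deterministic version gives the same output on every call.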
--
---
You received this message because you are subscribed to the Google Groups
"theano-users" group.
for x in range(x_range):
    for y in range(y_range):
        t_test_set_x = theano_translation(test_set_x, x, y, borrow=True)
        predict_model = theano.function(inputs=[index],
                                        outputs=layer3.errors(y),
                                        givens={
What surprises me is getting seg faults in the theano function itself; I
would have expected them to occur during evaluation on values...
On Wednesday, July 12, 2017 at 10:05:30 AM UTC+2, Christopher Bourez wrote:
>
> A second thing that is not clear to me in the documentation of Theano is
> how y
A second thing that is not clear to me in the documentation of Theano is
how you specify a C implementation and a GPU implementation of the same
custom op. Thank you
On Wednesday, July 12, 2017 at 9:58:34 AM UTC+2, Christopher Bourez wrote:
>
> I've also tried to create an example with theano.gpuarra
I've also tried to create an example with theano.gpuarray.nnet.GpuSoftmax but
after compilation it got replaced by another implementation, GpuDnnSoftmax:

Elemwise{mul,no_inplace} [id A] ''
 |HostFromGpu(gpuarray) [id B] ''
 | |GpuSoftmax [id C] ''
 |   |GpuFromHost [id D] ''
 |     |x
I don't know what you mean by "not modifying" the source for GpuEye:
- In this example, I'm importing an unmodified GpuEye op from Theano's
basic ops
- If I'm using theano.tensor.eye, then it does not use the GpuEye
Also, are you sure this test
https://github.com/Theano/Theano/blob/2625464534147f