Re: [theano-users] Separate 2d convolution over different channels

2017-10-19 Thread Peter O'Connor
Hi. Did anyone figure this out? I really need this op. In my case I want to compute the per-channel cross correlation of two (512, 20, 20) feature maps. Currently, I can't see any way to do this without using scan. My current solution is: class ChannelwiseCrossCorr(object): def __init
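A minimal sketch of the scan-based approach mentioned above (the (channels, rows, cols) layout, the border mode, and all names are illustrative assumptions, not taken from the original post):

    import theano
    import theano.tensor as tt
    from theano.tensor.nnet import conv2d

    a = tt.tensor3('a')  # e.g. a (512, 20, 20) feature map
    b = tt.tensor3('b')  # e.g. a (512, 20, 20) feature map

    def channel_xcorr(a_ch, b_ch):
        # conv2d wants 4D inputs, so add singleton batch/filter axes;
        # filter_flip=False makes this a cross-correlation rather than a convolution
        out = conv2d(a_ch[None, None, :, :], b_ch[None, None, :, :],
                     border_mode='full', filter_flip=False)
        return out[0, 0]

    # one conv2d call per channel, looped with scan
    xcorrs, _ = theano.scan(channel_xcorr, sequences=[a, b])
    f = theano.function([a, b], xcorrs)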

[theano-users] Re: Unable to use GPU even though CUDA is installed and GPU card is detected

2017-07-28 Thread Peter O'Connor
NICE! Yes, this also fixed it for me.


[theano-users] Re: New to theano. Trying to add a term to a loss function to penalize large weights

2017-01-18 Thread Peter O'Connor
Unless I'm mistaken, it seems like theano.tensor.sum(theano.tensor.exp(-10 * ws)) just encourages weights to become more positive. Why not use an L2 weight penalty like theano.tensor.sum(w**2)? So the full loss would become: crossentropy_categorical_1hot(coding_dist, true_dist) + sum((layer.w**
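For reference, a sketch of the suggested L2 penalty written out in full (coding_dist, true_dist and the layer list l_layers come from the original model and are not redefined here; l2_coef is an illustrative value):

    import theano.tensor as tt
    from theano.tensor.nnet import crossentropy_categorical_1hot

    l2_coef = 1e-4  # illustrative penalty strength
    # sum of squared weights over all layers of the (pre-existing) model
    l2_penalty = sum(tt.sum(layer.w ** 2) for layer in l_layers)
    loss = tt.mean(crossentropy_categorical_1hot(coding_dist, true_dist)) + l2_coef * l2_penalty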

[theano-users] Re: New to theano. Trying to add a term to a loss function to penalize large weights

2017-01-18 Thread Peter O'Connor
Unless I'm mistaken, it seems like theano.tensor.sum(theano.tensor.exp(-10 * ws)) just encourages weights to become more positive. Why not use an L2 weight penalty like theano.tensor.sum(w**2)? So the full loss would become: crossentropy_categorical_1hot(coding_dist, true_dist) + sum((l_layers[

Re: [theano-users] Scan: Return tuple of length 1?

2017-01-18 Thread Peter O'Connor
Hi Fred, thanks for your answer. It seems to me that scan should just return data in the format of the inner function (except with an added dimension to each tensor). The "return_list" flag feels like an unnecessary workaround. If the function had been def mul_by_2(data): return
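For reference, a sketch of how the return_list workaround looks in practice (the test values are illustrative):

    import numpy as np
    import theano
    import theano.tensor as tt

    x = tt.matrix()

    def mul_by_2(data):
        return (data * 2, )  # inner function returns a length-1 tuple

    # by default scan unwraps the single output to a bare tensor;
    # return_list=True keeps the list structure of the inner function
    outputs, updates = theano.scan(mul_by_2, sequences=[x], return_list=True)
    f = theano.function([x], outputs)
    (y, ) = f(np.array([[1., 2.], [3., 4.]], dtype=x.dtype))
    assert np.array_equal(y, [[2., 4.], [6., 8.]])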

[theano-users] Scan: Return tuple of length 1?

2017-01-16 Thread Peter O'Connor
Hi all. There's some really unintuitive/inconsistent behaviour with theano's scan and I'm wondering how I should work around it. Take the following test, which I would expect to pass: import theano import theano.tensor as tt def test_scan_tuple_output(): x = tt.matrix() def mul_by_2

[theano-users] Return a tuple of length-1 from scan?

2017-01-16 Thread Peter O'Connor
Hi all. There's some really unintuitive/inconsistent behaviour with theano's scan and I'm wondering how I should work around it. Take the following test, which I would expect to pass: import theano import theano.tensor as tt def test_scan_tuple_output(): x = tt.matrix() def mul_by_2

[theano-users] Re: Dynamically sized shared variable in scan loop:

2017-01-02 Thread Peter O'Connor
And the full trace: Traceback (most recent call last): File "/Users/peter/projects/spiking-experiments/scratch/pure_theano_dynamic_scan.py", line 34, in ys2 = f2(xs) File "/Users/peter/projects/spiking-experiments/venv/lib/python2.7/site-packages/theano/compile/function_module.py",

[theano-users] Dynamically sized shared variable in scan loop:

2017-01-02 Thread Peter O'Connor
Hi all, I have the following (stateful) function, which integrates an input into a persistent variable, and outputs a rounded version of that. The function is demonstrated here. import numpy as np import theano import theano.tensor as tt from theano.ifelse import ifelse def int_and_fire(x):
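The preview cuts off before the code; a minimal fixed-shape sketch of such a function (the shape and names are assumptions, and the dynamic-sizing issue from the thread title is not addressed here):

    import numpy as np
    import theano
    import theano.tensor as tt

    def int_and_fire(x, shape=(10, )):
        # persistent accumulator; for this sketch the shape is fixed up front
        acc = theano.shared(np.zeros(shape, dtype=theano.config.floatX))
        new_acc = acc + x
        # output a rounded version of the running integral
        return tt.round(new_acc), [(acc, new_acc)]

    x = tt.vector('x')
    y, updates = int_and_fire(x)
    f = theano.function([x], y, updates=updates)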

Re: [theano-users] Any way to initialize shared variable with unknown shape?

2016-11-14 Thread Peter O'Connor
Ah, great, thanks guys. I get what you mean by the ifelse now. Here is a full example with the solution: import numpy as np import theano.tensor as T import theano from theano.ifelse import ifelse floatX = theano.config.floatX def temp_diff(x): last = theano.shared(np.zeros((0, )*x.ndim, dt
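The preview truncates the example; a sketch along the same lines, assuming the idea is to start from a zero-sized shared variable and substitute zeros of the runtime shape until it has been updated once (details may differ from the original post):

    import numpy as np
    import theano
    import theano.tensor as tt
    from theano.ifelse import ifelse

    floatX = theano.config.floatX

    def temp_diff(x):
        # shared variable starts empty; ndim and dtype match x
        last = theano.shared(np.zeros((0, ) * x.ndim, dtype=floatX))
        # until the first update, substitute zeros shaped like the input
        last_or_zeros = ifelse(tt.eq(last.shape[0], 0), tt.zeros_like(x), last)
        return x - last_or_zeros, [(last, x)]

    x = tt.matrix('x')
    diff, updates = temp_diff(x)
    f = theano.function([x], diff, updates=updates)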

[theano-users] Re: it is about the dimension shuffle

2016-11-10 Thread Peter O'Connor
Hi Feras. The line self.layer0_input = input.reshape((66, 3, 100, 100)) is wrong - it will put your data in the right shape, but the data will be meaningless. I think you want dimshuffle, as in: self.layer0_input = input.dimshuffle(0, 3, 1, 2) The bias-addition part looks right though. O
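A short illustration of the difference, assuming the data arrives as (batch, rows, cols, channels) as in the thread:

    import theano.tensor as tt

    input = tt.tensor4('input')  # assumed to arrive as (batch, rows, cols, channels)
    # reshape only relabels the flat memory layout, so pixel values end up
    # mixed across the wrong channels:
    wrong = input.reshape((66, 3, 100, 100))
    # dimshuffle actually permutes the axes to (batch, channels, rows, cols):
    layer0_input = input.dimshuffle(0, 3, 1, 2)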

Re: [theano-users] Any way to initialize shared variable with unknown shape?

2016-11-10 Thread Peter O'Connor
Hi Fred, thanks for responding. Maybe I'm missing something, but I really can't see how `ifelse` helps here. I'm guessing you're referring to the solution in this post, but I can't see how it would apply here.

Re: [theano-users] Any way to initialize shared variable with unknown shape?

2016-11-10 Thread Peter O'Connor
Thanks Pascal, I tried your approach class TemporalDifference(object): def __init__(self): self.old = None def __call__(self, data): if self.old is None: self.old = theano.shared(np.zeros((1, )*data.ndim)) diff = data - self.old add_update(sel

Re: [theano-users] Any way to initialize shared variable with unknown shape?

2016-11-10 Thread Peter O'Connor
imensions and dtype are). > So you could create a shared variable with shape (1, 1, ...) for instance, and then either call set_value() or a function with updates=... to properly initialize it when you actually know its shape. > On Wed, Nov 09, 2016, Peter O'Conno
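A sketch of that suggestion (the placeholder shape, names, and set_value call are illustrative):

    import numpy as np
    import theano

    floatX = theano.config.floatX

    # allocate with singleton dimensions as a placeholder
    old = theano.shared(np.zeros((1, 1), dtype=floatX), name='old')

    # ... later, once the first batch arrives and the real shape is known:
    first_batch = np.zeros((20, 5), dtype=floatX)  # illustrative shape
    old.set_value(np.zeros_like(first_batch))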

[theano-users] Any way to initialize shared variable with unknown shape?

2016-11-09 Thread Peter O'Connor
Hi all, I'm implementing a "temporal difference", which is just this: class TemporalDifference(object): def __init__(self, shape): self.old = theano.shared(np.zeros(shape)) def __call__(self, data): diff = data - self.old add_update(self.old, data) return

[theano-users] Theano interferes with PyCharm's Debugger?

2016-10-20 Thread Peter O'Connor
Hi all, this is not strictly a Theano error, but it arises whenever you run the PyCharm debugger on code that imports theano. For some reason, when running in debug mode, I get the additional error message: Exception TypeError: TypeError("'NoneType' object

Re: [theano-users] Convolution with sparse input image

2016-08-17 Thread Peter O'Connor
no/library/tensor/nnet/blocksparse.html?highlight=block%20sparse#module-tensor.nnet.blocksparse > On Tue, Aug 16, 2016 at 11:31 AM, Peter O'Connor wrote: >> I'm looking for an efficient way to do convolution when the input images/feature maps are sparse.

[theano-users] Convolution with sparse input image

2016-08-16 Thread Peter O'Connor
I'm looking for an efficient way to do convolution when the input images/feature maps are sparse. So you'd have a sparse input (n_samples, n_features_in, n_rows, n_cols), a dense kernel (n_features_out, n_features_in, n_kernel_rows, n_kernel_cols), and produce either a sparse or dense output

[theano-users] Convolution with sparse input image

2016-08-16 Thread Peter O'Connor
I'm looking for an efficient way to do convolution when the input images/feature maps are sparse. So you'd have a sparse input (n_samples, n_features_in, n_rows, n_cols), a dense kernel (n_features_out, n_features_in, n_kernel_rows, n_kernel_cols), and produce either a sparse or dense output of s