Re: [julia-users] Escher global and local context

2016-02-01 Thread Leonardo

Hi Shashi,
this technique is also applicable to non-Signal variables, right?

However, the persistence hints for Signals are very interesting ...

Thanks

Leonardo


On 31/01/2016 19:11, Shashi Gowda wrote:

Hi Leonardo

If you have a single Escher process serving multiple requests and you
want to share some state between them, my recommendation would be to put
the shared data in a signal, which can optionally be persisted to disk/DB.

something like

if !isdefined(:my_global_signal) # makes sure all your UIs refer to the same variable
    my_global_signal = Signal(Any, initial_value)

    # you could also persist this to disk/DB for re-reading later, if
    # you have a function that can do that
    Reactive.foreach(save_to_disk, my_global_signal)

    # optionally, save throttle(1.0, my_global_signal) instead, to make
    # sure the state is saved at most once per second so as to not hit
    # the disk too often
end

function main(window)
    # ... use my_global_signal here to get and put updates: every
    # client will be notified of updates ...
end



On Sun, Jan 31, 2016 at 5:47 PM, Leonardo wrote:

Hi all,
I want to write an application with a Web UI (e.g. a chat) that has a
local state (specific to each client) and a global one (a state bound
to the web server).
These requirements are useful, for example, to permit communication
between different clients (e.g. in a chat) and/or to handle
centralized resources (like a DB connection shared by all clients and
created once in the web server).

How can I do that?

Many thanks in advance

Leonardo




Re: [julia-users] Escher global and local context

2016-02-01 Thread Shashi Gowda
Yes, this is how you can share non-signal variables as well. However, if you
use a non-signal variable to share state, your state updates will not be
propagated across clients once the main function has executed.

If you persist state as in the example, you can load it and make it the
initial value of the shared signal after a restart, picking up where you
left off.
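A minimal sketch of that load-then-share pattern (the file name "state.jls" and the use of Base serialization are assumptions for illustration, not part of Escher; `Signal(Any, ...)` follows the earlier example):

```julia
using Reactive

if !isdefined(:my_global_signal)
    # pick up the last saved state if it exists, else start fresh
    initial = isfile("state.jls") ? open(deserialize, "state.jls") : Dict()
    my_global_signal = Signal(Any, initial)

    # persist every update; wrapping the signal in throttle(1.0, ...)
    # would limit this to at most once per second
    Reactive.foreach(my_global_signal) do state
        open(io -> serialize(io, state), "state.jls", "w")
    end
end
```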

On Mon 1 Feb, 2016, 1:44 PM Leonardo  wrote:

> Hi Shashi,
> this technique is applicable also for non-Signal variables, right?
>
> However, persistence hints for Signals are very interestings ...
>
> Thanks
>
> Leonardo
>
>
> On 31/01/2016 19:11, Shashi Gowda wrote:
> > Hi Leonardo
> >
> > If you have a single Escher process serving multiple requests and you
> > want to share some state between them, my recommendation would be to put
> > the shared data in signal, which can optionally be persisted to disk/DB
> >
> > something like
> >
> > if !isdefined(:my_global_signal) # this condition makes sure all your UIs
> > are referring to the same variable
> >  my_global_signal = Signal(Any, initial_value)
> >
> >  # you could also persist this to disk/DB for re-reading later, if
> > you have a function that can do that.
> >  Reactive.foreach(save_to_disk, my_global_signal)
> >
> >  # optionally you could save throttle(1.0, my_global_signal) to make
> > sure the state is saved at most once every second so as to not hit the
> > disk often
> > end
> >
> > function main(window)
> > # ... use my_global_signal here, to get and put updates: every
> > client will be notified of updates...
> > end
> >
> >
> >
> > On Sun, Jan 31, 2016 at 5:47 PM, Leonardo wrote:
> >
> > Hi all,
> > I want write an application with a Web UI (e.g. a chat) that has a
> > local state (specific for each client) and global one (a state bind
> > to web server).
> > These requirements are useful for example to permit communications
> > between different client (e.g. into a chat) and/or handle
> > centralized resources (like a DB connection shared by all client and
> > created once into web server).
> >
> > How can I do that?
> >
> > Many thanks in advance
> >
> > Leonardo
> >
> >
>


[julia-users] Sparse matrix memory preallocation?

2016-02-01 Thread Kristoffer Carlsson
Create the vectors I, J, V, which hold the nonzero rows, columns, and values
respectively, and then call sparse(I, J, V).

[julia-users] recommended graphics packages for multiple curves on a single canvas

2016-02-01 Thread Jon Norberg
using Gadfly

L=Layer[]
push!(L,layer(x=1:10,y=rand(10),Geom.line)[])
push!(L,layer(x=1:10,y=rand(10),Geom.line)[])
push!(L,layer(x=1:10,y=rand(10),Geom.line)[])
plot(L)

The only awkward thing is that the trailing empty square bracket is needed, since layer returns an array of layers.

Colours can be styled using Themes (see the Gadfly documentation).
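If the trailing `[]` bothers you, `append!` is an alternative, assuming `layer` does indeed return a vector of layers:

```julia
using Gadfly

L = Layer[]
# append! splices the returned vector in, so no indexing is needed
append!(L, layer(x=1:10, y=rand(10), Geom.line))
append!(L, layer(x=1:10, y=rand(10), Geom.line))
plot(L)
```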

cheers

[julia-users] Re: recommended graphics packages for multiple curves on a single canvas

2016-02-01 Thread Andreas Lobinger
Hello colleague,

Tom Breloff maintains the Plots.jl (https://github.com/tbreloff/Plots.jl)  
package which is a meta-plotting thing (a layer above the actual plotting 
and rendering toolkits) and he collects some examples in 
https://github.com/tbreloff/ExamplePlots.jl, e.g. 
https://github.com/tbreloff/ExamplePlots.jl/tree/master/docs 
More than one curve on a single plot is supported by most backends, subplots
by many.

Hope that helps,
   Andreas

On Monday, February 1, 2016 at 4:53:30 AM UTC+1, Michael Landis wrote:
>
> Can anyone recommend a graphics package capable of plotting multiple curves 
> on a single canvas?  Python's pyplot/matplotlib multi-curve capabilities appear 
> to be unavailable within Julia (maybe I'm doing it wrong).
>


[julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread Kristoffer Carlsson
At computer now.

Something like this:

function f(k)
    I, J, V = Int[], Int[], Float64[]
    for b = 1:k
        idxs = (b-1)*2 + 1 : b*2
        for i in idxs, j in idxs
            push!(I, i)
            push!(J, j)
            push!(V, rand())
        end
    end
    return sparse(I, J, V)
end

@time f(1)
0.001932 seconds (71 allocations: 4.986 MB)



On Monday, February 1, 2016 at 9:25:22 AM UTC+1, Kristoffer Carlsson wrote:
>
> Create the vectors I J V which holds the nonzero rows, columns and values 
> respectively and then call sparse(I, J, V). 



[julia-users] Re: recommended graphics packages for multiple curves on a single canvas

2016-02-01 Thread Michael Landis
Gadfly is already working.  I will look at Breloff's Plots too. Thanks.




[julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread alan souza
You could try to use the triplet form (three vectors containing the row/column
indexes and the values of the entries) and call the function sparse.
In this way you can preallocate these three vectors in advance.
However, doing it this way is more cumbersome, because you need to have a
good estimate of the number of entries and to explicitly calculate the
index for all entries.

On Sunday, January 31, 2016 at 8:07:56 PM UTC-2, Gabriel Goh wrote:
>
> Generating a sparse matrix from scratch seems to be quite memory 
> intensive. and slow. Say I wish to create a large block diagonal matrix 
> with 2x2 block entries.
>
> Doing it naively is quite slow
>
> function f(k)
>   M = spzeros(2*k,2*k)
>   for i = 1:k
> D = (i-1)*2 + 1:i*2
> M[D,D] = randn(2,2)
>   end
>   return M
> end
>
> julia> @time f(1)
> 2.534277 seconds (239.26 k allocations: 3.013 GB, 15.58% gc time)
>
> Is there a way to speed this up by preallocating the memory somehow?
>


[julia-users] Range is not Array, Why?

2016-02-01 Thread Li ly
Hello, fellows, 

One question I feel a bit confused about is why there are UnitRange/FloatRange
types since we have Array in Julia.

cheers,

Yungui



Re: [julia-users] How to unpack a .tar file in Julia?

2016-02-01 Thread Aslak Grinsted

I have a problem: I do not have tar on my Windows machine, and for that 
reason Pkg.build("GR") fails. I tried fixing it using 7zip, as that was 
present in my Julia bin folder, by replacing

run(`tar xzf downloads/$tarball`)

with 

run(pipeline(`7z x -so downloads/$tarball`,`7z x -aoa -si -ttar -o"."`))

That worked fine on my Windows machine, but fails on 
Travis: https://travis-ci.org/jheinen/GR.jl/jobs/106192456

Any tips are welcome. 
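One way to keep both environments working might be to branch on whether `tar` is available. A sketch only (not tested on Travis; `tarball` is a placeholder for the archive name used in the build script, and the 7z flags are the ones from above):

```julia
# Fall back to 7z only when tar is missing (e.g. on Windows).
tarball = "gr.tar.gz"   # placeholder archive name
hastar = try success(`tar --version`) catch; false end
if hastar
    run(`tar xzf downloads/$tarball`)
else
    run(pipeline(`7z x -so downloads/$tarball`, `7z x -aoa -si -ttar`))
end
```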








On Tuesday, March 10, 2015 at 9:33:06 PM UTC+1, Stefan Karpinski wrote:
>
> I would just shell out to the tar command and then work with the untarred 
> directory.
>
> On Tue, Mar 10, 2015 at 4:05 PM, Weijian Zhang wrote:
>
>> Hello,
>>
>> I have a .tar.gz file. With GZip.jl, I can write code to unzip it to a 
>> .tar file.
>> But how can I unpack this .tar file in Julia?
>>
>> Thanks,
>>
>> Weijian
>>
>>
>

[julia-users] Newbie Question : Averaging neighbours for every element

2016-02-01 Thread Omkar Patil
Hello,

I have a 3D array of Float64 elements, and I need to average the values of 
neighbouring elements within a sphere, for every element.
`map` can run a function on every element, but how can I send the indices of 
the current element so that the function knows which elements are its 
neighbours?

I have just started studying Julia. Sorry for my naivety.

Thanks and regards,
Omkar




Re: [julia-users] Newbie Question : Averaging neighbours for every element

2016-02-01 Thread Stuart Brorson

Why don't you just write a set of nested "for" loops and iterate over
each element?  One of Julia's strengths is that "for" loops are fast,
so you don't need to worry about twisting your logic or your code
around in order to vectorize it as you do in Matlab or other such
languages.  If your logic is naturally expressed using "for" loops,
then write it out that way.
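A rough sketch of that approach (`neighbour_means` is a made-up name, and it uses a cubic window of radius `r` rather than a true sphere; adding a Euclidean distance check on the offsets would make it spherical):

```julia
function neighbour_means(A, r)
    B = similar(A)
    for k in 1:size(A,3), j in 1:size(A,2), i in 1:size(A,1)
        s = 0.0
        n = 0
        # visit the (2r+1)^3 cube around (i,j,k), clipped at the borders
        for dk in -r:r, dj in -r:r, di in -r:r
            ii, jj, kk = i+di, j+dj, k+dk
            if 1 <= ii <= size(A,1) && 1 <= jj <= size(A,2) && 1 <= kk <= size(A,3)
                s += A[ii,jj,kk]
                n += 1
            end
        end
        B[i,j,k] = s / n
    end
    return B
end
```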

Stuart


On Sun, 31 Jan 2016, Omkar Patil wrote:


Hello,

I have a 3D array of float64 elements. And I need to average values of
neighbouring elements in a sphere for every element.
Map can run a function on every element. But how can send indices of
current element so that the function knows neighbouring elements for each
element.

I have just started studying Julia. Sorry for my naivety.

Thanks and regards,
Omkar





Re: [julia-users] Newbie Question : Averaging neighbours for every element

2016-02-01 Thread Rafael Fourquet
This in-preparation blog post may be relevant:
https://github.com/JuliaLang/julialang.github.com/pull/324/files


Re: [julia-users] Range is not Array, Why?

2016-02-01 Thread Mauro
Ranges are a more compact representation of vectors with evenly spaced
elements:

julia> xdump(1:5:1000)
StepRange{Int64,Int64}
  start: Int64 1
  step: Int64 5
  stop: Int64 996

i.e. it only uses 3 numbers instead of 200 (for this example).  Anyway,
you should be able to use a range just like any other Vector as long as
you only read from it.  (It is a bug if a function in Julia-Base works
with a Vector but not a Range in a read-only context; please report it.)

If you need to write to it, you need to convert it to a normal vector
first:
v = collect(1:10)

This does trip up new users though, see e.g. this long thread:
https://groups.google.com/d/msg/julia-users/qPqgJS-usrU/Hu40_tOlDQAJ
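For example, in a read-only context the range and the collected vector behave the same:

```julia
r = 1:5:1000
v = collect(r)
@assert sum(r) == sum(v)
@assert r[3] == v[3] == 11
v[1] = 42      # fine: v is a Vector
# r[1] = 42    # would throw an error: ranges cannot be written to
```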

On Mon, 2016-02-01 at 01:29, Li ly  wrote:
> Hello, fellows,
>
> One question I feel a bit confused is why there is UnitRange/FloatRange
> since we have Array in Julia.
>
> cheers,
>
> Yungui


[julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Jon Norberg
I have searched and tried a few things but cannot remove the background grids 
in Gadfly. It's probably simple and I am missing something obvious... Any 
suggestions would be appreciated.

layer(x=E, y=wetness(E,10.0), Geom.line, Theme(default_color=a[1], 
line_width=2pt, grid_color=colorant"white"))

I also tried grid_line_width=0pt,

but they still show up (I save it as SVG, but the output in Jupyter shows 
them too).

Re: [julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Tom Breloff
It's certainly possible, as I can do this:

[image: Inline image 1]

Looking at my source (
https://github.com/tbreloff/Plots.jl/blob/master/src/backends/gadfly.jl#L490-L492)
it seems like I'm setting the "grid_color" keyword in the "Gadfly.Theme"
constructor to match the background color.  However that seems to be what
you already tried, so I'm not sure what's different.  I tried this in
IJulia and at the REPL... both results were the same for me.

On Mon, Feb 1, 2016 at 8:58 AM, Jon Norberg 
wrote:

> I have searched and tried a few things but cannot remove the background
> grids in Gadfly. Its probably simple and I am missing something
> obvious...Any suggestions would be appreciated.
>
> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1],
> line_width=2pt, grid_color=colorant"white")
>
> also tried grid_line_width=0pt
>
> but they still show up ( I save it as SVG, but also output in jupiter
> shows them)


Re: [julia-users] Need Ref on UInt32 but not Float64

2016-02-01 Thread Erik Schnetter
Can you be more specific? What happens if you don't use `Ref`?

`Ref` is clearly wrong given the callback signature you provide.

Also, you may want to use `Cuint` as type instead of `UInt32`.
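For reference, a sketch of a signature that matches the C typedef, passing everything by value:

```julia
# Matches: typedef void (*CALLBACKFUN)(unsigned int a, double b, unsigned int c);
function callbackfun(a::Cuint, b::Cdouble, c::Cuint)
    # ... use a, b, c ...
    return nothing
end

cb = cfunction(callbackfun, Void, (Cuint, Cdouble, Cuint))
```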

-erik

On Mon, Feb 1, 2016 at 12:50 AM, Bryan Rivera  wrote:
> For some reason I need to specify Ref{UInt32} instead of just UInt32.
> Float64 does not seem to require the same.
>
> Can anyone explain this behavior?
>
> function callbackfun(a::UInt32, b::Float64, c::UInt32)
> ...
> end
>
> cfunction(callbackfun, Void, (Ref{UInt32},Float64,Ref{UInt32}))
>
>
> typedef void (*CALLBACKFUN)(unsigned int z, double b, unsigned int c);
>
>
>



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


[julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread Kristoffer Carlsson
> However  doing in this way is more cumbersome because you need to have a 
good estimate of the number of entries

Not true. The difference between preallocating the arrays and just pushing 
into them is not that large, due to how Julia arrays work (constant 
amortized time etc.).
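And when an estimate of the entry count is available anyway, `sizehint!` gives you the preallocation without changing the push!-based code. A sketch for the 2x2-block example (`f_hinted` is a made-up name):

```julia
function f_hinted(k)
    I, J, V = Int[], Int[], Float64[]
    # each 2x2 block contributes 4 entries
    sizehint!(I, 4k); sizehint!(J, 4k); sizehint!(V, 4k)
    for b = 1:k
        idxs = (b-1)*2 + 1 : b*2
        for i in idxs, j in idxs
            push!(I, i); push!(J, j); push!(V, rand())
        end
    end
    return sparse(I, J, V)
end
```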


On Monday, February 1, 2016 at 1:34:12 PM UTC+1, alan souza wrote:
>
> You could try to use the triplet form (tree vectors containing the row/col 
> indexes and the value of the entry) and call the function sparse.
> In this way you can preallocate in advance these three vectors. 
> However  doing in this way is more cumbersome because you need to have a 
> good estimate of the number of entries and to explicitly calculate the 
> index for all entries.
>
> On Sunday, January 31, 2016 at 8:07:56 PM UTC-2, Gabriel Goh wrote:
>>
>> Generating a sparse matrix from scratch seems to be quite memory 
>> intensive. and slow. Say I wish to create a large block diagonal matrix 
>> with 2x2 block entries.
>>
>> Doing it naively is quite slow
>>
>> function f(k)
>>   M = spzeros(2*k,2*k)
>>   for i = 1:k
>> D = (i-1)*2 + 1:i*2
>> M[D,D] = randn(2,2)
>>   end
>>   return M
>> end
>>
>> julia> @time f(1)
>> 2.534277 seconds (239.26 k allocations: 3.013 GB, 15.58% gc time)
>>
>> Is there a way to speed this up by preallocating the memory somehow?
>>
>

Re: [julia-users] Re: recommended graphics packages for multiple curves on a single canvas

2016-02-01 Thread Tom Breloff
Michael: If you have a specific plot-type in mind let me know... I'm
usually pretty quick to whip up an example or point you in the right
direction.  Not all features are documented well (yet) so it might be
faster to ask for help.

On Mon, Feb 1, 2016 at 4:48 AM, Michael Landis 
wrote:

> Gadfly already working.  I will look at Breloff's Plots too. Thanks.
>
>
>


Re: [julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread Erik Schnetter
Compare this function:

```Julia
function f2(k)
M = spzeros(2*k,2*k)
for i = 1:k
j1 = (i-1)*2+1
j2 = i*2
M[j1,j1] = rand()
M[j2,j1] = rand()
M[j1,j2] = rand()
M[j2,j2] = rand()
end
return M
end
```
which is much faster. It seems your original code has two performance
issues that are unrelated to sparse matrix memory allocation:
(1) `randn` allocates a new matrix every time
(2) Something about indexing sparse matrices with ranges seems slow (I
don't know why)

If you want to continue to use `randn`, then you can use `randn!`
instead, and preallocate the small matrix outside the loop.
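That variant might look like this (same structure as `f2`, with a reused 2x2 buffer filled in place by `randn!`):

```Julia
function f3(k)
    M = spzeros(2*k, 2*k)
    A = Array(Float64, 2, 2)   # preallocated buffer, overwritten each pass
    for i = 1:k
        randn!(A)
        j = (i-1)*2 + 1
        M[j,   j  ] = A[1,1]
        M[j+1, j  ] = A[2,1]
        M[j,   j+1] = A[1,2]
        M[j+1, j+1] = A[2,2]
    end
    return M
end
```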

-erik


On Mon, Feb 1, 2016 at 9:42 AM, Kristoffer Carlsson
 wrote:
>> However  doing in this way is more cumbersome because you need to have a
>> good estimate of the number of entries
>
> Not true. The difference between pre allocating the arrays and just pushing
> into them is not that large due to how julia arrays work (constant
> ammortized time etc).
>
>
> On Monday, February 1, 2016 at 1:34:12 PM UTC+1, alan souza wrote:
>>
>> You could try to use the triplet form (tree vectors containing the row/col
>> indexes and the value of the entry) and call the function sparse.
>> In this way you can preallocate in advance these three vectors.
>> However  doing in this way is more cumbersome because you need to have a
>> good estimate of the number of entries and to explicitly calculate the index
>> for all entries.
>>
>> On Sunday, January 31, 2016 at 8:07:56 PM UTC-2, Gabriel Goh wrote:
>>>
>>> Generating a sparse matrix from scratch seems to be quite memory
>>> intensive. and slow. Say I wish to create a large block diagonal matrix with
>>> 2x2 block entries.
>>>
>>> Doing it naively is quite slow
>>>
>>> function f(k)
>>>   M = spzeros(2*k,2*k)
>>>   for i = 1:k
>>> D = (i-1)*2 + 1:i*2
>>> M[D,D] = randn(2,2)
>>>   end
>>>   return M
>>> end
>>>
>>> julia> @time f(1)
>>> 2.534277 seconds (239.26 k allocations: 3.013 GB, 15.58% gc time)
>>>
>>> Is there a way to speed this up by preallocating the memory somehow?



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


Re: [julia-users] Re: deep learning for regression?

2016-02-01 Thread Tom Breloff
One thing to keep in mind is stability.  Small changes to weights in the
early layers of a deep feedforward network might have large impacts on the
final regression result.  This is not as big of a problem in classification
tasks, because the final result is squashed to a small range (usually
[0,1]).  In a regression problem, however, you run the risk that
overfitting your model creates large instabilities in the final results.
You won't have these instabilities with a simple linear regression, because
there just aren't as many degrees of freedom to overfit.  The extreme
non-linearities of a deep network exacerbate the problem.

If you're interested in learning, feel free to look through my code in
https://github.com/tbreloff/OnlineAI.jl.  It's pure-julia, and it's very
straightforward to create deep nets with online normalization, dropout,
many solvers (AdaDelta, AdaMax, etc) and some nice ways to sample
datasets.  It is very much "experimental" in the sense that I am not
building it with the intent of being the next TensorFlow... however it's
less of a black box compared to other frameworks, so you might find it
helpful when learning.  It is also not GPU optimized, so it won't be the
best for really large problems.

On Sun, Jan 31, 2016 at 10:24 PM, Jason Eckstein 
wrote:

> The reason why most of the deep learning focus is on classification is
> because image classification and voice recognition is where all the
> research money and focus is for the large companies that are investing in
> machine learning, i.e. Google, Baidu, Facebook, Microsoft, etc  Also a
> number of important competitions focus on image recognition.  That doesn't
> mean that the same success, challenges, and tools don't exist for
> regression problems, it's just not a primary focus in the community right
> now.  There are a number of interesting papers on solving regression
> problems and all the regularization techniques and network architectures
> that are useful but it really depends on your particular problem and how
> much data you have to work with.  I can tell you from personal experience
> that neural networks can be very effective at solving these types of
> problems and fitting very complex functions but doing it correctly requires
> careful regularization and choosing the right network architecture.  Also,
> if you use stochastic gradient descent or batch gradient descent, redundant
> inputs are not really a problem at all.  Since these problems are so
> specific to each problem you're trying to solve I can't really give you
> more help other than this general advice.  A more detailed discussion of
> all this is probably better suited to a private email exchange.  As far as
> the most modern state of the art techniques go, many of them can be applied
> to regression problems as well even though the use examples shown are for
> classification.
>
>
> On Saturday, January 30, 2016 at 7:46:06 AM UTC-7, michae...@gmail.com
> wrote:
>>
>> Thanks, that's pretty much my understanding. Scaling the inputs seems to
>> be important, too, from what I read. I'm also interested in a framework
>> that will trim off redundant inputs.
>>
>> I have run the mocha tutorial examples, and it looks very promising
>> because the structure is clear, and there are C++ and cuda backends. The
>> C++ backend, with openmp, gives me a good performance boost over the pure
>> Julia backend. However, I'm not so sure that it will allow for trimming
>> redundant inputs. Also, I have some ideas on how to restrict the net to
>> remove observationally equivalent configurations, which should aid in
>> training, and I don't think I could implement those ideas with mocha.
>>
>> From what I see, the focus of much recent work in neural nets seems to be
>> on classification and labeling of images, and regression examples using the
>> modern tools seem to be scarce. I'm wondering if that's because other tools
>> work better for regression, or simply because it's an old problem that is
>> considered to be well studied. I would like to see some examples of
>> regression nets that work well, using the modern tools, though, if there
>> are any out there.
>>
>> On Saturday, January 30, 2016 at 2:32:16 PM UTC+1, Jason Eckstein wrote:
>>>
>>> I've been using NN for regression and I've experimented with Mocha.  I
>>> ended up coding my own network for speed purposes but in general you simply
>>> leave the final output of the neural network as a linear combination
>>> without applying an activation function.  That way the output can represent
>>> a real number rather than compress it into a 0 to 1 or -1 to 1 range for
>>> classification.  You can leave the rest of the network unchanged.
>>>
>>> On Saturday, January 30, 2016 at 3:45:27 AM UTC-7, michae...@gmail.com
>>> wrote:

 I'm interested in using neural networks (deep learning) for
 multivariate multiple regression, with multiple real valued inputs and
 multiple real valued outputs. At the moment

Re: [julia-users] Plotting with Plot

2016-02-01 Thread Tom Breloff
Maybe a silly question, but does the output look bad?  Or are you just
reporting the warnings?  I don't know enough about Pango to say exactly how
to fix this.  You could try installing the font in your system, or you can
override the defaults with something like:

using Plots
gadfly()
f = "Sans"
default(tickfont = font(f,8), guidefont = font(f,11), legendfont =
font(f,8))

note: you can add this to your ~/.juliarc.jl file to always run this on
REPL startup.


On Sun, Jan 31, 2016 at 6:57 PM, digxx  wrote:

> I use Plot as Plotting Device with Gadfly Backend and used
>
> savefig("testfile") to save my previously plotted figure..
> Apparently it has some issues with the font. Do I need to load it in Julia
> seperately?
>
> (julia.exe:6312): Pango-WARNING **: couldn't load font "Helvetica
> Not-Rotated 10.669921875px", falling back to "Sans Not-Rotated
> 10.669921875px", expect ugly output.
>
> (julia.exe:6312): Pango-WARNING **: couldn't load font "Helvetica
> Not-Rotated 14.669921875px", falling back to "Sans Not-Rotated
> 14.669921875px", expect ugly output.
>


Re: [julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread Stefan Karpinski
randn() also has a scalar form.

On Mon, Feb 1, 2016 at 9:56 AM, Erik Schnetter  wrote:

> Compare this function:
>
> ```Julia
> function f2(k)
> M = spzeros(2*k,2*k)
> for i = 1:k
> j1 = (i-1)*2+1
> j2 = i*2
> M[j1,j1] = rand()
> M[j2,j1] = rand()
> M[j1,j2] = rand()
> M[j2,j2] = rand()
> end
> return M
> end
> ```
> which is much faster. It seems your original code has two performance
> issues that are unrelated to sparse matrix memory allocation:
> (1) `randn` allocates a new matrix every time
> (2) Something about indexing sparse matrices with ranges seems slow (I
> don't know why)
>
> If you want to continue to use `randn`, then you can use `randn!`
> instead, and preallocate the small matrix outside the loop.
>
> -erik
>
>
> On Mon, Feb 1, 2016 at 9:42 AM, Kristoffer Carlsson
>  wrote:
> >> However  doing in this way is more cumbersome because you need to have a
> >> good estimate of the number of entries
> >
> > Not true. The difference between pre allocating the arrays and just
> pushing
> > into them is not that large due to how julia arrays work (constant
> > ammortized time etc).
> >
> >
> > On Monday, February 1, 2016 at 1:34:12 PM UTC+1, alan souza wrote:
> >>
> >> You could try to use the triplet form (tree vectors containing the
> row/col
> >> indexes and the value of the entry) and call the function sparse.
> >> In this way you can preallocate in advance these three vectors.
> >> However  doing in this way is more cumbersome because you need to have a
> >> good estimate of the number of entries and to explicitly calculate the
> index
> >> for all entries.
> >>
> >> On Sunday, January 31, 2016 at 8:07:56 PM UTC-2, Gabriel Goh wrote:
> >>>
> >>> Generating a sparse matrix from scratch seems to be quite memory
> >>> intensive. and slow. Say I wish to create a large block diagonal
> matrix with
> >>> 2x2 block entries.
> >>>
> >>> Doing it naively is quite slow
> >>>
> >>> function f(k)
> >>>   M = spzeros(2*k,2*k)
> >>>   for i = 1:k
> >>> D = (i-1)*2 + 1:i*2
> >>> M[D,D] = randn(2,2)
> >>>   end
> >>>   return M
> >>> end
> >>>
> >>> julia> @time f(1)
> >>> 2.534277 seconds (239.26 k allocations: 3.013 GB, 15.58% gc time)
> >>>
> >>> Is there a way to speed this up by preallocating the memory somehow?
>
>
>
> --
> Erik Schnetter 
> http://www.perimeterinstitute.ca/personal/eschnetter/
>


Re: [julia-users] Re: recommended graphics packages for multiple curves on a single canvas

2016-02-01 Thread Miguel Bazdresch
Gaston can do that, too. Just be sure to use master instead of the latest
release; there are tons of bug fixes and improvements. The PDF documentation
is here: https://bitbucket.org/mbaz/gaston/downloads/gastondoc-0.5.5.pdf

-- mb

On Mon, Feb 1, 2016 at 9:45 AM, Tom Breloff  wrote:

> Michael: If you have a specific plot-type in mind let me know... I'm
> usually pretty quick to whip up an example or point you in the right
> direction.  Not all features are documented well (yet) so it might be
> faster to ask for help.
>
> On Mon, Feb 1, 2016 at 4:48 AM, Michael Landis 
> wrote:
>
>> Gadfly already working.  I will look at Breloff's Plots too. Thanks.
>>
>>
>>
>


[julia-users] Re: deep learning for regression?

2016-02-01 Thread michael . creel
Thanks everyone for the comments and pointers to code. I have coded up a 
simple example, fitting y = sin(x) + error, and the results are very good, 
enough so that I'll certainly be investigating further with larger-scale 
problems. I may try to use one of the existing packages, but it may be 
convenient to use the MPI package to distribute the training data, as I 
have a 32-core machine. If I come up with some interesting code, I'll post 
it here.


Re: [julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Rob J. Goedman
Hi Tom,

Which version of Plots are you using?

Rob




julia> using Plots

julia> gadfly(size=(400,200))
ERROR: ArgumentError: function gadfly does not accept keyword arguments

julia> Pkg.installed("Plots")
v"0.5.1"

julia> versioninfo()
Julia Version 0.4.3
Commit a2f713d (2016-01-12 21:37 UTC)
Platform Info:
  System: Darwin (x86_64-apple-darwin13.4.0)
  CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.3


> On Feb 1, 2016, at 06:38, Tom Breloff  wrote:
> 
> It's certainly possible, as I can do this:
> 
> 
> 
> Looking at my source 
> (https://github.com/tbreloff/Plots.jl/blob/master/src/backends/gadfly.jl#L490-L492)
>  it seems like I'm setting the "grid_color" keyword in the "Gadfly.Theme" 
> constructor to match the background color.  However that seems to be what you 
> already tried, so I'm not sure what's different.  I tried this is IJulia and 
> at the REPL... both results were the same for me.
> 
> On Mon, Feb 1, 2016 at 8:58 AM, Jon Norberg wrote:
> I have searched and tried a few things but cannot remove the background grids 
> in Gadfly. Its probably simple and I am missing something obvious...Any 
> suggestions would be appreciated.
> 
> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1], 
> line_width=2pt, grid_color=colorant"white")
> 
> also tried grid_line_width=0pt
> 
> but they still show up ( I save it as SVG, but also output in jupiter shows 
> them)
> 



[julia-users] Re: recommended graphics packages for multiple curves on a single canvas

2016-02-01 Thread David P. Sanders

With PyPlot it should be as simple as

using PyPlot

x = 0:0.01:1
plot(x, sin(x))
plot(x, cos(x))

Does this not work?


On Sunday, January 31, 2016 at 21:53:30 (UTC-6), Michael Landis wrote:
>
> Can anyone recommend a graphics package capable of plotting multiple curves 
> on a single canvas?  Python's pyplot/matplotlib multi-curve capabilities appear 
> to be unavailable within Julia (maybe I'm doing it wrong).
>


[julia-users] Parametric Type Question - updating type

2016-02-01 Thread Christopher Alexander
Hello all, I have a question about the usage of parametric types.  I know 
these bring a performance boost (at least that was my understanding), but I 
have a situation where I have a parametric type defined as such:

type PricingEngine{T <: TermStructure}
    varA::Float64
    varB::Float64
    ts::T
end


But then I need to actually swap the existing term structure for another 
subtype of TermStructure further down the road. With parametric types, it 
complains, because I guess it's locked in to using whatever TermStructure 
subtype is initially there when I instantiate the PricingEngine type.  Is 
there any way to do such an update while still using a type parameter, or 
am I stuck with a definition that uses the broader abstract type?

Thanks!

Chris


Re: [julia-users] Need Ref on UInt32 but not Float64

2016-02-01 Thread Bryan Rivera
Hi Erik,

I get 'garbage' results if I don't use Ref.  

The number returned may change slightly, but it is way too large.  I don't 
know if it's a memory address or what.




[julia-users] Memory management in Julia

2016-02-01 Thread Madeleine Udell
Hi all,

I'm running into some memory management issues: in particular, a malloc 
error that claims I am modifying an object after freeing it (see this 
question). The error is:

julia(9849,0x7fff705d0300) malloc: *** error for object 0x7f96a332f408: 
incorrect checksum for freed object - object was probably modified after 
being freed. *** set a breakpoint in malloc_error_break to debug

I'm not sure how to debug it: what's the best way to search for code that 
might be modifying an object after freeing it in Julia? (For example, I 
don't know what or where malloc_error_break is.)

Thanks!
Madeleine


[julia-users] Re: Has module pre-compilation has been back-ported to Julia 0.3.11?

2016-02-01 Thread mcarrizosa
Upgrading is not straightforward since we have customers with 6-12 month 
upgrade cycles.

On Saturday, January 30, 2016 at 1:16:36 AM UTC-8, Tero Frondelius wrote:
>
> Do you mind me asking: why would you need an old version of Julia?



Re: [julia-users] Need Ref on UInt32 but not Float64

2016-02-01 Thread Yichao Yu
On Feb 1, 2016 1:38 PM, "Bryan Rivera"  wrote:
>
> Hi Eric,
>
> I get 'garbage' results if I don't use Ref.
>
> The number returned may change slightly, but it is way too large.  I
don't know if it's a memory address or what.

What platform is it, and are you using the right ABI if you are on Windows?

>
>


Re: [julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Tom Breloff
This should work on master.  Do Pkg.checkout("Plots").  Before that, you
would do "gadfly(); default(size=(400,200))"

On Mon, Feb 1, 2016 at 1:12 PM, Rob J. Goedman  wrote:

> Hi Tom,
>
> Which version of Plots are you using?
>
> Rob
>
>
>
> *julia> **using Plots*
>
> *julia> **gadfly(size=(400,200))*
> *ERROR: ArgumentError: function gadfly does not accept keyword arguments*
>
> *julia> **Pkg.installed("Plots")*
> *v"0.5.1"*
>
> *julia> **versioninfo()*
> Julia Version 0.4.3
> Commit a2f713d (2016-01-12 21:37 UTC)
> Platform Info:
>   System: Darwin (x86_64-apple-darwin13.4.0)
>   CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
>   WORD_SIZE: 64
>   BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
>   LAPACK: libopenblas64_
>   LIBM: libopenlibm
>   LLVM: libLLVM-3.3
>
>
> On Feb 1, 2016, at 06:38, Tom Breloff  wrote:
>
> It's certainly possible, as I can do this:
>
> 
>
> Looking at my source (
> https://github.com/tbreloff/Plots.jl/blob/master/src/backends/gadfly.jl#L490-L492)
> it seems like I'm setting the "grid_color" keyword in the "Gadfly.Theme"
> constructor to match the background color.  However that seems to be what
> you already tried, so I'm not sure what's different.  I tried this is
> IJulia and at the REPL... both results were the same for me.
>
> On Mon, Feb 1, 2016 at 8:58 AM, Jon Norberg 
> wrote:
>
>> I have searched and tried a few things but cannot remove the background
>> grids in Gadfly. Its probably simple and I am missing something
>> obvious...Any suggestions would be appreciated.
>>
>> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1],
>> line_width=2pt, grid_color=colorant"white")
>>
>> also tried grid_line_width=0pt
>>
>> but they still show up ( I save it as SVG, but also output in jupiter
>> shows them)
>
>
>
>


Re: [julia-users] Parametric Type Question - updating type

2016-02-01 Thread Tom Breloff
You could just construct a new object with the new TermStructure, instead
of overwriting the old one.

On Mon, Feb 1, 2016 at 1:27 PM, Christopher Alexander 
wrote:

> Hello all, I have a question about the usage of parametric types.  I know
> these bring about a performance boost (at least that was my understanding),
> but I have a situation where I have a parametric type defined as such:
>
> type PricingEngine{T <: TermStructure}
>  varA::Float64
>  varB::Float64
>  ts::T
> end
>
>
> But then I need to actually swap the existing term structure with another
> subtype of TermStructure further down the road. Using parametric types, it
> complains because I guess it's locked in to using whatever TermStructure
> sub type is initially there when I instantiate the PricingEngine type.  Is
> there anyway to do such an update while still using a type parameter, or am
> I stuck just with a definition that uses the broader abstract type?
>
> Thanks!
>
> Chris
>


Re: [julia-users] Memory management in Julia

2016-02-01 Thread Yichao Yu
On Mon, Feb 1, 2016 at 1:39 PM, Madeleine Udell
 wrote:
> Hi all,
>
> I'm running into some memory management issues: in particular, a malloc
> error that claims I am modifying an object after freeing it: see this
> question. The error is,
>
> julia(9849,0x7fff705d0300) malloc: *** error for object 0x7f96a332f408:
> incorrect checksum for freed object - object was probably modified after
> being freed. *** set a breakpoint in malloc_error_break to debug
>
> I'm not sure how to debug it: what's the best way to search for code that
> might be modifying an object after freeing it in Julia? (For example, I
> don't know what or where malloc_error_break is.)

Which platform is this and how repeatable is it?

>
> Thanks!
> Madeleine


Re: [julia-users] Re: Has module pre-compilation been back-ported to Julia 0.3.11?

2016-02-01 Thread Stefan Karpinski
It's completely understandable to not be free to upgrade. Unfortunately,
module precompilation is a highly non-trivial change that cannot reasonably
be backported to the 0.3 branch, which is now in maintenance mode. The best
you can do is to bake the necessary packages into your system image via the
userimg trick.
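
For reference, the userimg trick amounts to listing the needed packages in `base/userimg.jl` and rebuilding the system image; a sketch with hypothetical package names (details vary between 0.3 and 0.4):

```julia
# base/userimg.jl -- ordinary Julia code executed while building the sysimg,
# so any package loaded here gets compiled into the system image.
using Gadfly        # hypothetical: substitute the packages you need baked in
using DataFrames
```

On 0.4 you would then run `contrib/build_sysimg.jl`; on 0.3 the file is picked up when building Julia itself with `make`.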

On Mon, Feb 1, 2016 at 1:41 PM,  wrote:

> Upgrading is not straightforward since we have customers with 6-12 month
> upgrade cycles.
>
> On Saturday, January 30, 2016 at 1:16:36 AM UTC-8, Tero Frondelius wrote:
>>
>> Do you mind me asking: why would you need an old version of Julia?
>
>


Re: [julia-users] Memory management in Julia

2016-02-01 Thread Madeleine Udell
This is on a mac; we've got a variety of function calls giving errors, some
with probability ~.5 and some every time we've run them.

On Mon, Feb 1, 2016 at 10:51 AM, Yichao Yu  wrote:

> On Mon, Feb 1, 2016 at 1:39 PM, Madeleine Udell
>  wrote:
> > Hi all,
> >
> > I'm running into some memory management issues: in particular, a malloc
> > error that claims I am modifying an object after freeing it: see this
> > question. The error is,
> >
> > julia(9849,0x7fff705d0300) malloc: *** error for object 0x7f96a332f408:
> > incorrect checksum for freed object - object was probably modified after
> > being freed. *** set a breakpoint in malloc_error_break to debug
> >
> > I'm not sure how to debug it: what's the best way to search for code that
> > might be modifying an object after freeing it in Julia? (For example, I
> > don't know what or where malloc_error_break is.)
>
> Which platform is this and how repeatable is it?
>
> >
> > Thanks!
> > Madeleine
>



-- 
Madeleine Udell
Postdoctoral Fellow at the Center for the Mathematics of Information
California Institute of Technology
https://courses2.cit.cornell.edu/mru8
(415) 729-4115


[julia-users] ANN: new blog post on array indexing, iteration, and multidimensional algorithms

2016-02-01 Thread Tim Holy
It's come to my attention that some of the exciting capabilities of julia 0.4 
for indexing, iteration, and writing generic multidimensional algorithms may 
be under-documented. This new blog post

http://julialang.org/blog/2016/02/iteration/

aims to provide a "gentle tutorial" illustrating how to use these 
capabilities.

Best,
--Tim
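
As a taste of what the post covers, `eachindex` lets one loop body work for any array shape or index type — a small illustrative sketch (not taken from the post itself):

```julia
# Generic sum over any AbstractArray: eachindex yields the most efficient
# index type for A (linear for Array, CartesianIndex for views, etc.).
function mysum(A::AbstractArray)
    s = zero(eltype(A))
    for I in eachindex(A)
        s += A[I]
    end
    return s
end

@assert mysum([1 2; 3 4]) == 10
```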



Re: [julia-users] Parametric Type Question - updating type

2016-02-01 Thread Christopher Alexander
So something like:

function update_ts(pe::PricingEngine, newTS::TermStructure)
 newPE = PricingEngine(pe.varA, pe.varB, newTS)
 return newPE
end


myPE = PricingEngine(4.5, 5.5, TermStructureA())


myPE = update_ts(myPE, TermStructureB())

You probably wouldn't be able to update the "myPE" object in place, right 
(i.e. updating it in the actual update_ts method and then returning itself)?


On Monday, February 1, 2016 at 1:50:41 PM UTC-5, Tom Breloff wrote:
>
> You could just construct a new object with the new TermStructure, instead 
> of overwriting the old one.
>
> On Mon, Feb 1, 2016 at 1:27 PM, Christopher Alexander  > wrote:
>
>> Hello all, I have a question about the usage of parametric types.  I know 
>> these bring about a performance boost (at least that was my understanding), 
>> but I have a situation where I have a parametric type defined as such:
>>
>> type PricingEngine{T <: TermStructure}
>>  varA::Float64
>>  varB::Float64
>>  ts::T
>> end
>>
>>
>> But then I need to actually swap the existing term structure with another 
>> subtype of TermStructure further down the road. Using parametric types, it 
>> complains because I guess it's locked in to using whatever TermStructure 
>> sub type is initially there when I instantiate the PricingEngine type.  Is 
>> there anyway to do such an update while still using a type parameter, or am 
>> I stuck just with a definition that uses the broader abstract type?
>>
>> Thanks!
>>
>> Chris
>>
>
>
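
The behaviour Chris describes can be reproduced in a few lines (written in current `struct` syntax rather than the 0.4 `type` syntax used in the thread; the names mirror his sketch):

```julia
abstract type TermStructure end
struct TermStructureA <: TermStructure end
struct TermStructureB <: TermStructure end

mutable struct PricingEngine{T <: TermStructure}
    varA::Float64
    varB::Float64
    ts::T
end

pe = PricingEngine(4.5, 5.5, TermStructureA())
# pe.ts = TermStructureB()  # errors: the field's type was fixed to
#                           # TermStructureA when pe was constructed
pe2 = PricingEngine(pe.varA, pe.varB, TermStructureB())  # a new object works
@assert pe2.ts isa TermStructureB
```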

Re: [julia-users] Need Ref on UInt32 but not Float64

2016-02-01 Thread Bryan Rivera
 


Operating System: CentOS Linux 7 (Core)


Kernel: Linux 3.10.0-229.20.1.el7.x86_64


centos-release-7-1.1503.el7.centos.2.8.x86_64



Just ran it again to get some values for you all:


The numbers do not change.  One is 3056772480 and the other 3056772572.


The values are supposed to be varying and on the order of 29105550 and 1000 
respectively.



On Monday, February 1, 2016 at 1:48:00 PM UTC-5, Yichao Yu wrote:
>
>
> On Feb 1, 2016 1:38 PM, "Bryan Rivera" > 
> wrote:
> >
> > Hi Eric,
> >
> > I get 'garbage' results if I don't use Ref.  
> >
> > The number returned may change slightly, but it is way too large.  I 
> don't know if it's a memory address or what.
>
> What platform it is and are you using the right abi if you are on Windows?
>
> >
> >
>


Re: [julia-users] Parametric Type Question - updating type

2016-02-01 Thread Christopher Alexander
This doesn't seem to work if your PricingEngine object is attached to some 
other object.  Like, for example if you have:

type Bond
 pricingEngine::PricingEngine
end


myPE = PricingEngine(4.5, 5.5, TermStructureA())


myBond = Bond(myPE)


myPE = update_ts(myPE, TermStructureB())


At that point, myBond's pricing engine still points to the older myPE with 
TermStructureA.

On Monday, February 1, 2016 at 1:58:54 PM UTC-5, Christopher Alexander 
wrote:
>
> So something like:
>
> function update_ts(pe::PricingEngine, newTS::TermStructure)
>  newPE = PricingEngine(pe.varA, pe.varB, newTS)
>  return newPE
> end
>
>
> myPE = PricingEngine(4.5, 5.5, TermStructureA())
>
>
> myPE = update_ts(myPE, TermStructureB())
>
> You probably wouldn't be able to update the "myPE" object in place right 
> (i.e. updating it in the actual update_ts method and then returning itself)?
>
>
> On Monday, February 1, 2016 at 1:50:41 PM UTC-5, Tom Breloff wrote:
>>
>> You could just construct a new object with the new TermStructure, instead 
>> of overwriting the old one.
>>
>> On Mon, Feb 1, 2016 at 1:27 PM, Christopher Alexander  
>> wrote:
>>
>>> Hello all, I have a question about the usage of parametric types.  I 
>>> know these bring about a performance boost (at least that was my 
>>> understanding), but I have a situation where I have a parametric type 
>>> defined as such:
>>>
>>> type PricingEngine{T <: TermStructure}
>>>  varA::Float64
>>>  varB::Float64
>>>  ts::T
>>> end
>>>
>>>
>>> But then I need to actually swap the existing term structure with 
>>> another subtype of TermStructure further down the road. Using parametric 
>>> types, it complains because I guess it's locked in to using whatever 
>>> TermStructure sub type is initially there when I instantiate the 
>>> PricingEngine type.  Is there anyway to do such an update while still using 
>>> a type parameter, or am I stuck just with a definition that uses the 
>>> broader abstract type?
>>>
>>> Thanks!
>>>
>>> Chris
>>>
>>
>>

Re: [julia-users] Memory management in Julia

2016-02-01 Thread Matt Bauman
Can you reproduce it if you run julia with the --check-bounds=yes command 
line argument?

On Monday, February 1, 2016 at 1:55:25 PM UTC-5, Madeleine Udell wrote:
>
> This is on a mac; we've got a variety of function calls giving errors, 
> some with probability ~.5 and some every time we've run them.
>
> On Mon, Feb 1, 2016 at 10:51 AM, Yichao Yu 
> > wrote:
>
>> On Mon, Feb 1, 2016 at 1:39 PM, Madeleine Udell
>> > wrote:
>> > Hi all,
>> >
>> > I'm running into some memory management issues: in particular, a malloc
>> > error that claims I am modifying an object after freeing it: see this
>> > question. The error is,
>> >
>> > julia(9849,0x7fff705d0300) malloc: *** error for object 0x7f96a332f408:
>> > incorrect checksum for freed object - object was probably modified after
>> > being freed. *** set a breakpoint in malloc_error_break to debug
>> >
>> > I'm not sure how to debug it: what's the best way to search for code 
>> that
>> > might be modifying an object after freeing it in Julia? (For example, I
>> > don't know what or where malloc_error_break is.)
>>
>> Which platform is this and how repeatable is it?
>>
>> >
>> > Thanks!
>> > Madeleine
>>
>
>
>
> -- 
> Madeleine Udell
> Postdoctoral Fellow at the Center for the Mathematics of Information
> California Institute of Technology
> https://courses2.cit.cornell.edu/mru8
> (415) 729-4115
>


[julia-users] Re: Remove Gadfly gridlines?

2016-02-01 Thread Alex Mellnik
You are probably applying another theme later, and the last one overrides 
that setting.  The following example works for me:




-A


On Monday, February 1, 2016 at 5:58:13 AM UTC-8, Jon Norberg wrote:
>
> I have searched and tried a few things but cannot remove the background 
> grids in Gadfly. Its probably simple and I am missing something 
> obvious...Any suggestions would be appreciated.
>
> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1], 
> line_width=2pt, grid_color=colorant"white")
>
> also tried grid_line_width=0pt
>
> but they still show up ( I save it as SVG, but also output in jupiter 
> shows them)
>
>

Re: [julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Rob J. Goedman
Thanks Tom,

Gets a bit further. Final subplot(rand(100,2)) works fine.

Regards, Rob

julia> using Plots

julia> gadfly(size=(400,200))
Plots.GadflyPackage()

julia> subplot(rand(100,2), grid=[true, false])
[Plots.jl] Initializing backend: gadfly
ERROR: TypeError: non-boolean (Array{Bool,1}) used in boolean context
 in updateGadflyPlotTheme at 
/Users/rob/.julia/v0.4/Plots/src/backends/gadfly.jl:490
 in _postprocess_subplot at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:270
 in subplot! at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:350
 in subplot at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:186

julia> Pkg.installed("Plots")
v"0.5.1+"

julia> Pkg.installed("Gadfly")
v"0.4.2"

julia> subplot(rand(100,2))

julia> 

> On Feb 1, 2016, at 10:48, Tom Breloff  wrote:
> 
> This should work on master.  Do Pkg.checkout("Plots").  Before that, you 
> would do "gadfly(); default(size=(400,200))"
> 
> On Mon, Feb 1, 2016 at 1:12 PM, Rob J. Goedman  > wrote:
> Hi Tom,
> 
> Which version of Plots are you using?
> 
> Rob
> 
> 
> 
> 
> julia> using Plots
> 
> julia> gadfly(size=(400,200))
> ERROR: ArgumentError: function gadfly does not accept keyword arguments
> 
> julia> Pkg.installed("Plots")
> v"0.5.1"
> 
> julia> versioninfo()
> Julia Version 0.4.3
> Commit a2f713d (2016-01-12 21:37 UTC)
> Platform Info:
>   System: Darwin (x86_64-apple-darwin13.4.0)
>   CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
>   WORD_SIZE: 64
>   BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
>   LAPACK: libopenblas64_
>   LIBM: libopenlibm
>   LLVM: libLLVM-3.3
> 
> 
>> On Feb 1, 2016, at 06:38, Tom Breloff > > wrote:
>> 
>> It's certainly possible, as I can do this:
>> 
>> 
>> 
>> Looking at my source 
>> (https://github.com/tbreloff/Plots.jl/blob/master/src/backends/gadfly.jl#L490-L492
>>  
>> )
>>  it seems like I'm setting the "grid_color" keyword in the "Gadfly.Theme" 
>> constructor to match the background color.  However that seems to be what 
>> you already tried, so I'm not sure what's different.  I tried this is IJulia 
>> and at the REPL... both results were the same for me.
>> 
>> On Mon, Feb 1, 2016 at 8:58 AM, Jon Norberg > > wrote:
>> I have searched and tried a few things but cannot remove the background 
>> grids in Gadfly. Its probably simple and I am missing something 
>> obvious...Any suggestions would be appreciated.
>> 
>> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1], 
>> line_width=2pt, grid_color=colorant"white")
>> 
>> also tried grid_line_width=0pt
>> 
>> but they still show up ( I save it as SVG, but also output in jupiter shows 
>> them)
>> 
> 
> 



[julia-users] v0.4.3 Generic Linux binaries on CentOS 6

2016-02-01 Thread daniel . matz
Hello,

In the past, I've been able to download the generic linux binaries for a 
release, copy them into my user directory on my organization's lab, and try 
out Julia without bothering my sys admin.

I just tried getting the v0.4.3 generic binaries, and I'm now getting an 
error when I try to use the package manager.

julia> Pkg.init()
git: error while loading shared libraries: libcrypto.so.6: cannot open 
shared object file: No such file or directory
ERROR: failed process: Process(`git version`, ProcessExited(127)) [127]
 in pipeline_error at process.jl:555
 in readbytes at process.jl:515
 in version at pkg/git.jl:36
 in init at pkg/dir.jl:35
 in init at pkg.jl:19

The system does have libcrypto.so.10.  Is this just a version issue with 
this library?

Thanks!

Daniel
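
One commonly suggested (unsupported) workaround, assuming the bundled git only lacks the old soname and is otherwise compatible with the newer library, is to expose `libcrypto.so.10` under the name git expects:

```shell
# At-your-own-risk sketch: give git's loader a libcrypto.so.6 name that
# points at the system's libcrypto.so.10 (paths are hypothetical).
mkdir -p "$HOME/lib"
ln -sf /usr/lib64/libcrypto.so.10 "$HOME/lib/libcrypto.so.6"
# then launch Julia with the extra library path:
# LD_LIBRARY_PATH="$HOME/lib:$LD_LIBRARY_PATH" ./julia
```

If the exported symbol versions differ this can still fail; asking the sysadmin for a compat-openssl package is the cleaner fix.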


[julia-users] Re: IBM LinuxONE ...

2016-02-01 Thread Páll Haraldsson

In case you do not know, LinuxONE machines are not x86 (the architecture 
Julia supports best):

https://developer.ibm.com/linuxone/resources/faq/
"All the commands such as ls, ps, top, cp, and mv work the same as in an 
X86 Linux system."

At least I'm pretty sure they are PowerPC only, and that port is in alpha 
state at best. ARM is already "supported" (with nightly releases), better 
than PowerPC. I didn't follow this closely, but PowerPC uses huge pages 
rather than the usual 4 KB, which seemed to be one issue.

-- 
Palli.

On Saturday, January 30, 2016 at 10:04:08 AM UTC, cdm wrote:
>
>
> https://developer.ibm.com/linuxone/resources/
>
>
> anyone running Julia there yet ... ?
>
> an Ubuntu machine is due to
> be available later in Q1 2016 ...
>


Re: [julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Tom Breloff
You are passing in a vector for the grid argument... it needs to be a
matrix (1 x 2).

In Plots, the arguments are sliced up into columns before building the
plot.  So in my example, the call is similar to:

subplot(plot(rand(100), grid=true), plot(rand(100), grid=false))


however your call is similar to:

subplot(plot(rand(100), grid=[true, false]), plot(rand(100), grid=[true,
> false]))


You're passing a boolean vector to both series, whereas I am passing a
boolean.  Let me know if you need any more explanation.
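
The underlying Julia distinction, for anyone puzzled by the error: a comma builds a `Vector`, a space builds a 1×2 row `Matrix`, and Plots slices per-series arguments along columns:

```julia
v = [true, false]   # Vector{Bool}: handed whole to *each* series
m = [true false]    # 1x2 Matrix{Bool}: one Bool per series (column)
@assert size(v) == (2,)
@assert size(m) == (1, 2)
```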

On Mon, Feb 1, 2016 at 2:33 PM, Rob J. Goedman  wrote:

> Thanks Tom,
>
> Gets a bit further. Final subplot(rand(100,2)) works fine.
>
> Regards, Rob
>
> *julia> **using Plots*
>
> *julia> **gadfly(size=(400,200))*
> *Plots.GadflyPackage()*
>
> *julia> **subplot(rand(100,2), grid=[true, false])*
> [Plots.jl] Initializing backend: gadfly
> *ERROR: TypeError: non-boolean (Array{Bool,1}) used in boolean context*
> * in updateGadflyPlotTheme at
> /Users/rob/.julia/v0.4/Plots/src/backends/gadfly.jl:490*
> * in _postprocess_subplot at
> /Users/rob/.julia/v0.4/Plots/src/subplot.jl:270*
> * in subplot! at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:350*
> * in subplot at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:186*
>
> *julia> **Pkg.installed("Plots")*
> *v"0.5.1+"*
>
> *julia> **Pkg.installed("Gadfly")*
> *v"0.4.2"*
>
> *julia> **subplot(rand(100,2))*
>
> *julia> *
>
> On Feb 1, 2016, at 10:48, Tom Breloff  wrote:
>
> This should work on master.  Do Pkg.checkout("Plots").  Before that, you
> would do "gadfly(); default(size=(400,200))"
>
> On Mon, Feb 1, 2016 at 1:12 PM, Rob J. Goedman  wrote:
>
>> Hi Tom,
>>
>> Which version of Plots are you using?
>>
>> Rob
>>
>> 
>>
>>
>> *julia> **using Plots*
>>
>> *julia> **gadfly(size=(400,200))*
>> *ERROR: ArgumentError: function gadfly does not accept keyword arguments*
>>
>> *julia> **Pkg.installed("Plots")*
>> *v"0.5.1"*
>>
>> *julia> **versioninfo()*
>> Julia Version 0.4.3
>> Commit a2f713d (2016-01-12 21:37 UTC)
>> Platform Info:
>>   System: Darwin (x86_64-apple-darwin13.4.0)
>>   CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
>>   WORD_SIZE: 64
>>   BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
>>   LAPACK: libopenblas64_
>>   LIBM: libopenlibm
>>   LLVM: libLLVM-3.3
>>
>>
>> On Feb 1, 2016, at 06:38, Tom Breloff  wrote:
>>
>> It's certainly possible, as I can do this:
>>
>> 
>>
>> Looking at my source (
>> https://github.com/tbreloff/Plots.jl/blob/master/src/backends/gadfly.jl#L490-L492)
>> it seems like I'm setting the "grid_color" keyword in the "Gadfly.Theme"
>> constructor to match the background color.  However that seems to be what
>> you already tried, so I'm not sure what's different.  I tried this is
>> IJulia and at the REPL... both results were the same for me.
>>
>> On Mon, Feb 1, 2016 at 8:58 AM, Jon Norberg 
>> wrote:
>>
>>> I have searched and tried a few things but cannot remove the background
>>> grids in Gadfly. Its probably simple and I am missing something
>>> obvious...Any suggestions would be appreciated.
>>>
>>> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1],
>>> line_width=2pt, grid_color=colorant"white")
>>>
>>> also tried grid_line_width=0pt
>>>
>>> but they still show up ( I save it as SVG, but also output in jupiter
>>> shows them)
>>
>>
>>
>>
>
>


Re: [julia-users] Plotting with Plot

2016-02-01 Thread digxx
Hey,
Thanks for your reply... The problem is that apparently using savefig cuts off 
half of the labeling...
So when plotting in my browser say i have the labeling on the right:

Label1
Label2
Label3

then when saving the figure everything else plots fine, but the labels are 
just cut off in the image, i.e.

Lab
Lab
Lab

and then the picture ends on the right!?
I thought it could be due to some font issue; maybe Helvetica is narrower?


Re: [julia-users] Plotting with Plot

2016-02-01 Thread digxx
Maybe one dumb question about the juliarc.jl
In what folder should that be?! Apparently I don't have it; does it have to 
contain some basic content?
if I do:

default(tickfont = font(f,8), guidefont = font(f,11), legendfont = 
font(f,8)) > ~./juliarc.jl
it says:
ERROR: syntax: "/" is not a unary operator


Re: [julia-users] Plotting with Plot

2016-02-01 Thread Tom Breloff
Create a file in your home directory (.juliarc.jl) and copy all 4 lines in.

http://docs.julialang.org/en/release-0.4/manual/getting-started/

On Mon, Feb 1, 2016 at 4:47 PM, digxx  wrote:

> Maybe one dumb question about the juliarc.jl
> In what folder should that be?! Apparently I dont have it, does it have to
> contain some basic content?
> if I do:
>
> default(tickfont = font(f,8), guidefont = font(f,11), legendfont =
> font(f,8)) > ~./juliarc.jl
> it says:
> ERROR: syntax: "/" is not a unary operator
>


Re: [julia-users] Plotting with Plot

2016-02-01 Thread Jeffrey Sarnoff
(it is a reasonable question)

If you want to use a .juliarc.jl file (notice the leading '.' in its name), 
create it.
It usually would be placed in your home directory: 
 (Linux)     ~/.juliarc.jl 
 (OS X)      ~/.juliarc.jl  (i.e. /Users/*you*/.juliarc.jl)
 (Windows)   C:\Users\*you*\.juliarc.jl

then put what you want to put inside

On Monday, February 1, 2016 at 4:47:03 PM UTC-5, digxx wrote:
>
> Maybe one dumb question about the juliarc.jl
> In what folder should that be?! Apparently I dont have it, does it have to 
> contain some basic content?
> if I do:
>
> default(tickfont = font(f,8), guidefont = font(f,11), legendfont = 
> font(f,8)) > ~./juliarc.jl
> it says:
> ERROR: syntax: "/" is not a unary operator
>
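
Putting the pieces together: the file digxx wants would contain plain Julia code, not a shell redirection — a sketch of a possible `~/.juliarc.jl` (the font choice is hypothetical):

```julia
# ~/.juliarc.jl -- executed at every Julia startup
using Plots
f = "Helvetica"                    # hypothetical font name
default(tickfont   = font(f, 8),
        guidefont  = font(f, 11),
        legendfont = font(f, 8))
```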


Re: [julia-users] Parametric Type Question - updating type

2016-02-01 Thread Tom Breloff
Just so you realize... in this example "pricingEngine" has an abstract
type, and you've possibly lost whatever performance gain you were hoping
for in your original question.  To solve you either need to take the same
approach in defining and updating the Bond object, or maybe rethink how
you're doing this.  You should consider utilizing multiple dispatch a
little more to keep your PricingEngine and TermStructure separate:

do_something(engine::PricingEngine, term::TermStructure) = ...



On Mon, Feb 1, 2016 at 2:07 PM, Christopher Alexander 
wrote:

> This doesn't seem to work if your PricingEngine object is attached to some
> other object.  Like, for example if you have:
>
> type Bond
>  pricingEngine::PricingEngine
> end
>
>
> myPE = PricingEngine(4.5, 5.5, TermStructureA())
>
>
> myBond = Bond(myPE)
>
>
> myPE = update_ts(myPE, TermStructureB())
>
>
> At that point, myBond's pricing engine still points to the older myPE with
> TermStructureA.
>
> On Monday, February 1, 2016 at 1:58:54 PM UTC-5, Christopher Alexander
> wrote:
>>
>> So something like:
>>
>> function update_ts(pe::PricingEngine, newTS::TermStructure)
>>  newPE = PricingEngine(pe.varA, pe.varB, newTS)
>>  return newPE
>> end
>>
>>
>> myPE = PricingEngine(4.5, 5.5, TermStructureA())
>>
>>
>> myPE = update_ts(myPE, TermStructureB())
>>
>> You probably wouldn't be able to update the "myPE" object in place right
>> (i.e. updating it in the actual update_ts method and then returning itself)?
>>
>>
>> On Monday, February 1, 2016 at 1:50:41 PM UTC-5, Tom Breloff wrote:
>>>
>>> You could just construct a new object with the new TermStructure,
>>> instead of overwriting the old one.
>>>
>>> On Mon, Feb 1, 2016 at 1:27 PM, Christopher Alexander >> > wrote:
>>>
 Hello all, I have a question about the usage of parametric types.  I
 know these bring about a performance boost (at least that was my
 understanding), but I have a situation where I have a parametric type
 defined as such:

 type PricingEngine{T <: TermStructure}
  varA::Float64
  varB::Float64
  ts::T
 end


 But then I need to actually swap the existing term structure with
 another subtype of TermStructure further down the road. Using parametric
 types, it complains because I guess it's locked in to using whatever
 TermStructure sub type is initially there when I instantiate the
 PricingEngine type.  Is there anyway to do such an update while still using
 a type parameter, or am I stuck just with a definition that uses the
 broader abstract type?

 Thanks!

 Chris

>>>
>>>


[julia-users] Re: Anonymous functions now faster? Need for functors?

2016-02-01 Thread colintbowers
Great news! I've been keenly looking forward to this one. A big thanks to 
all the devs involved.

I've updated a few StackOverflow questions on this topic with links to the 
issues page.

Cheers,

Colin

On Sunday, 31 January 2016 23:35:39 UTC+11, Ben Ward wrote:
>
> Hi,
>
> I just saw this merged PR .
>
> In the past I used the functor trick of defining a type, and then a call 
> method for the type, and passing this to a function, to get past the 
> anonymous function inefficiency.
>
> Does this PR mean (to a mortal like me) that on 0.5 now this is no longer 
> necessary?
>
> Thanks,
> Ben.
>


Re: [julia-users] ANN: new blog post on array indexing, iteration, and multidimensional algorithms

2016-02-01 Thread Miguel Bazdresch
Tim,

Thanks for putting this tutorial together, it was an interesting read.

-- mb

On Mon, Feb 1, 2016 at 1:54 PM, Tim Holy  wrote:

> It's come to my attention that some of the exciting capabilities of julia
> 0.4
> for indexing, iteration, and writing generic multidimensional algorithms
> may
> be under-documented. This new blog post
>
> http://julialang.org/blog/2016/02/iteration/
>
> aims to provide a "gentle tutorial" illustrating how to use these
> capabilities.
>
> Best,
> --Tim
>
>


[julia-users] Plotting (Greek letter label, high resolution...)

2016-02-01 Thread digxx
I have a few questions regarding plotting:
1. How do I plot Greek letters? In particular I want to use them as strings 
in the labeling of axes or curves.
2. When using Plots with the Gadfly backend, how do I savefig with higher 
resolution? So far it produces 600x400.
3. How do I print x^*? (like it would look in LaTeX) 


[julia-users] Every module recompiles every time?

2016-02-01 Thread Alex Mellnik
At some point since the release of 0.4.3 every module that I use started 
always recompiling whenever I first use it in a new kernel.  If I'm using 
something with lots of dependencies like Gadfly this can lead to 
several-minute startup times.  What determines when modules are recompiled 
and how can I avoid this?  

Two possible things which may be related -- I had a computer crash while 
modules were recompiling around the same time, and I have several packages 
which are either pinned to their master or on my own development branch. 
 Thanks  -A
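
One thing worth trying, given the crash mid-recompile: Julia 0.4 stores its precompiled `.ji` caches under `~/.julia/lib/v0.4`, and a cache left corrupt by a crash can make precompilation re-run every session. Deleting the directory forces a clean rebuild (sketch; the path assumes a default setup):

```shell
# Remove the (possibly corrupt) precompilation cache; Julia rebuilds it
# the next time each module is loaded.
rm -rf "$HOME/.julia/lib/v0.4"
```

Note that packages pinned to master or checked out on a development branch will still recompile whenever their files are newer than the cache.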


[julia-users] Re: Plotting (Greek letter label, high resolution...)

2016-02-01 Thread digxx
Sorry just found out about the second...
So still need 1 and 3 :-(


Re: [julia-users] Plotting with Plot

2016-02-01 Thread digxx
So concerning the actual problem. How can I increase the space for the 
legend so that it would show up completly in my plot?


Re: [julia-users] How to unpack a .tar file in Julia?

2016-02-01 Thread Tony Kelman
Check the BinDeps package for unpack_cmd


On Monday, February 1, 2016 at 4:34:13 AM UTC-8, Aslak Grinsted wrote:
>
>
> I have a problem that i do not have tar on my windows machine, and for 
> that reason Pkg.build("GR") fails. I tried fixing it using 7zip as that was 
> present in my julia bin folder by replacing 
>
> run(`tar xzf downloads/$tarball`)
>
> with 
>
> run(pipeline(`7z x -so downloads/$tarball`,`7z x -aoa -si -ttar -o"."`))
>
> --- That worked fine on my windows machine, but fails on travis: 
> https://travis-ci.org/jheinen/GR.jl/jobs/106192456
>
> Any tips are welcome. 
>
>
>
>
>
>
>
>
> On Tuesday, March 10, 2015 at 9:33:06 PM UTC+1, Stefan Karpinski wrote:
>>
>> I would just shell out to the tar command and then work with the untarred 
>> directory.
>>
>> On Tue, Mar 10, 2015 at 4:05 PM, Weijian Zhang  wrote:
>>
>>> Hello,
>>>
>>> I have a .tar.gz file. With GZip.jl, I can write code to unzip it to a 
>>> .tar file.
>>> But how can I unpack this .tar file in Julia?
>>>
>>> Thanks,
>>>
>>> Weijian
>>>
>>>
>>

Re: [julia-users] Parametric Type Question - updating type

2016-02-01 Thread Christopher Alexander
I've been trying to think of a way to implement this, but I'm having some 
trouble.  Perhaps this can be a discussion around organizing objects in 
Julia to best exploit the benefits of multiple dispatch.  Let's say I have 
a Bond type:

type Bond{P <: PricingEngine}
 # some attributes here
 pricingEngine::P
end


Then the particular PricingEngine object for each instantiation of "Bond" 
has its own term structure (perhaps this could be stored with the bond 
itself?).  Each PricingEngine subtype (in my code PricingEngine is 
actually an abstract type itself) has its own calculation method, where 
various components of the bond are calculated (e.g. NPV, etc).  I suppose 
this could be separated out, but I essentially want to provide to the end 
user something like npv(myBond).  The bond knows whether it's been 
calculated or not, and if it hasn't, it does so via its pricing engine. 
 Otherwise, it returns a cached value of its NPV (already having been 
calculated).  If I break all of these things out (bond/instrument, term 
structure, and pricing engine), I would envision a method like this:
npv(bond, pricing_engine, term_structure)

Is there a better/more "Julian" way to organize this?  Perhaps keeping the 
TermStructure separate from everyone and passing it into methods where I 
need it?


On Monday, February 1, 2016 at 5:05:21 PM UTC-5, Tom Breloff wrote:
>
> Just so you realize... in this example "pricingEngine" has an abstract 
> type, and you've possibly lost whatever performance gain you were hoping 
> for in your original question.  To solve you either need to take the same 
> approach in defining and updating the Bond object, or maybe rethink how 
> you're doing this.  You should consider utilizing multiple dispatch a 
> little more keep your PricingEngine and TermStructure separate:
>
> do_something(engine::PricingEngine, term::TermStructure) = ...
>
>
>
> On Mon, Feb 1, 2016 at 2:07 PM, Christopher Alexander  > wrote:
>
>> This doesn't seem to work if your PricingEngine object is attached to 
>> some other object.  Like, for example if you have:
>>
>> type Bond
>>  pricingEngine::PricingEngine
>> end
>>
>>
>> myPE = PricingEngine(4.5, 5.5, TermStructureA())
>>
>>
>> myBond = Bond(myPE)
>>
>>
>> myPE = update_ts(myPE, TermStructureB())
>>
>>
>> At that point, myBond's pricing engine still points to the older myPE 
>> with TermStructureA.
>>
>> On Monday, February 1, 2016 at 1:58:54 PM UTC-5, Christopher Alexander 
>> wrote:
>>>
>>> So something like:
>>>
>>> function update_ts(pe::PricingEngine, newTS::TermStructure)
>>>  newPE = PricingEngine(pe.varA, pe.varB, newTS)
>>>  return newPE
>>> end
>>>
>>>
>>> myPE = PricingEngine(4.5, 5.5, TermStructureA())
>>>
>>>
>>> myPE = update_ts(myPE, TermStructureB())
>>>
>>> You probably wouldn't be able to update the "myPE" object in place right 
>>> (i.e. updating it in the actual update_ts method and then returning itself)?
>>>
>>>
>>> On Monday, February 1, 2016 at 1:50:41 PM UTC-5, Tom Breloff wrote:

 You could just construct a new object with the new TermStructure, 
 instead of overwriting the old one.

 On Mon, Feb 1, 2016 at 1:27 PM, Christopher Alexander <
 uvap...@gmail.com> wrote:

> Hello all, I have a question about the usage of parametric types.  I 
> know these bring about a performance boost (at least that was my 
> understanding), but I have a situation where I have a parametric type 
> defined as such:
>
> type PricingEngine{T <: TermStructure}
>  varA::Float64
>  varB::Float64
>  ts::T
> end
>
>
> But then I need to actually swap the existing term structure with 
> another subtype of TermStructure further down the road. Using parametric 
> types, it complains because I guess it's locked in to using whatever 
> TermStructure sub type is initially there when I instantiate the 
> PricingEngine type.  Is there anyway to do such an update while still 
> using 
> a type parameter, or am I stuck just with a definition that uses the 
> broader abstract type?
>
> Thanks!
>
> Chris
>


>

[julia-users] Are there solutions for tuple/fixed-length vector arithmetic?

2016-02-01 Thread Nathan Baum
In some code I'm writing I have inner loops which, amongst other things, 
translate spherical coordinates into Cartesian.

I notice that the inner loop is a *lot* faster if I change the code to use 
tuples. This makes sense, but when I use tuples I can't use the maths 
operators and functions that I can with vectors.

Is there some package which provides these functions for tuples or, 
alternatively, which offers alternative implementations of efficient 
fixed-length vectors?

The code is

rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
  [r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ)] # slower version
end


vs

rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
  (r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ)) # faster version
end
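
For reference while package suggestions come in: the missing operators can also be defined by hand for a fixed tuple size. A minimal sketch (0.4-era syntax; the `Vec3` alias and the operator set are my own choices, not from any package):

```julia
# Element-wise arithmetic for 3-tuples, so tuple-based code can keep
# vector-style notation without heap-allocating an Array.
import Base: +, -, *

typealias Vec3 NTuple{3,Float64}

+(a::Vec3, b::Vec3) = (a[1]+b[1], a[2]+b[2], a[3]+b[3])
-(a::Vec3, b::Vec3) = (a[1]-b[1], a[2]-b[2], a[3]-b[3])
*(s::Float64, a::Vec3) = (s*a[1], s*a[2], s*a[3])

rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
  (r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ))
end

v = 2.0 * rll2xyz(1.0, 0.0, 0.0)   # (2.0, 0.0, 0.0)
```

This keeps the fast tuple representation while restoring the familiar operators; a dedicated package would cover more sizes and operations.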




[julia-users] Re: v0.4.3 Generic Linux binaries on CentOS 6

2016-02-01 Thread Tony Kelman
Where are you getting git from? Can you use git outside of Julia on this 
system? On Julia 0.4 we do not include git in the Linux binaries.


On Monday, February 1, 2016 at 12:40:28 PM UTC-8, danie...@gmail.com wrote:
>
> Hello,
>
> In the past, I've been able to download the generic linux binaries for a 
> release, copy them into my user directory on my organization's lab, and try 
> out Julia without bothering my sys admin.
>
> I just tried getting the v0.4.3 generic binaries, and I'm now getting an 
> error when I try to use the package manager.
>
> julia> Pkg.init()
> git: error while loading shared libraries: libcrypto.so.6: cannot open 
> shared object file: No such file or directory
> ERROR: failed process: Process(`git version`, ProcessExited(127)) [127]
>  in pipeline_error at process.jl:555
>  in readbytes at process.jl:515
>  in version at pkg/git.jl:36
>  in init at pkg/dir.jl:35
>  in init at pkg.jl:19
>
> The system does have libcrypto.so.10.  Is this just a version issue with 
> this library?
>
> Thanks!
>
> Daniel
>
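
Not from the thread, but a workaround often seen for this class of error on CentOS when a binary insists on an older libcrypto soname: point a user-local symlink at the library you do have. The paths below are examples, and whether the two versions are compatible enough for git's needs is not guaranteed:

```shell
# Create a private compat directory and alias libcrypto.so.10 under the
# soname the binary is asking for, then put it on the search path.
# Adjust /usr/lib64/libcrypto.so.10 to wherever the library lives.
mkdir -p "$HOME/lib-compat"
ln -sf /usr/lib64/libcrypto.so.10 "$HOME/lib-compat/libcrypto.so.6"
export LD_LIBRARY_PATH="$HOME/lib-compat:$LD_LIBRARY_PATH"
```

This avoids touching system directories, so no sys admin is needed; remove the directory and the environment variable to undo it.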


Re: [julia-users] Remove Gadfly gridlines?

2016-02-01 Thread Rob J. Goedman
Thanks Tom, that works fine.
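
For reference, the vector-vs-matrix distinction from the thread below, spelled out (a sketch against the Plots/Gadfly versions discussed here):

```julia
using Plots
gadfly()

# [true, false] is a Vector  -> handed to *every* series (the error above).
# [true false]  is a 1x2 Matrix -> column 1 goes to series 1, column 2 to series 2.
subplot(rand(100,2), grid=[true false])
```

The only change from the failing call is the missing comma, which turns the `Bool` vector into a one-row matrix that Plots slices into per-series values.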

> On Feb 1, 2016, at 13:15, Tom Breloff  wrote:
> 
> You are passing in a vector for the grid argument... it needs to be a matrix 
> (1 x 2).
> 
> In Plots, the arguments are sliced up into columns before building the plot.  
> So in my example, the call is similar to:
> 
> subplot(plot(rand(100), grid=true), plot(rand(100), grid=false))
> 
> however your call is similar to:
> 
> subplot(plot(rand(100), grid=[true, false]), plot(rand(100), grid=[true, 
> false]))
> 
> You're passing a boolean vector to both series, whereas I am passing a 
> boolean.  Let me know if you need any more explanation.
> 
> On Mon, Feb 1, 2016 at 2:33 PM, Rob J. Goedman wrote:
> Thanks Tom,
> 
> Gets a bit further. Final subplot(rand(100,2)) works fine.
> 
> Regards, Rob
> 
> julia> using Plots
> 
> julia> gadfly(size=(400,200))
> Plots.GadflyPackage()
> 
> julia> subplot(rand(100,2), grid=[true, false])
> [Plots.jl] Initializing backend: gadfly
> ERROR: TypeError: non-boolean (Array{Bool,1}) used in boolean context
>  in updateGadflyPlotTheme at 
> /Users/rob/.julia/v0.4/Plots/src/backends/gadfly.jl:490
>  in _postprocess_subplot at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:270
>  in subplot! at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:350
>  in subplot at /Users/rob/.julia/v0.4/Plots/src/subplot.jl:186
> 
> julia> Pkg.installed("Plots")
> v"0.5.1+"
> 
> julia> Pkg.installed("Gadfly")
> v"0.4.2"
> 
> julia> subplot(rand(100,2))
> 
> julia> 
> 
>> On Feb 1, 2016, at 10:48, Tom Breloff wrote:
>> This should work on master.  Do Pkg.checkout("Plots").  Before that, you 
>> would do "gadfly(); default(size=(400,200))"
>> 
>> On Mon, Feb 1, 2016 at 1:12 PM, Rob J. Goedman wrote:
>> 
>> Which version of Plots are you using?
>> 
>> Rob
>> 
>> 
>> 
>> 
>> julia> using Plots
>> 
>> julia> gadfly(size=(400,200))
>> ERROR: ArgumentError: function gadfly does not accept keyword arguments
>> 
>> julia> Pkg.installed("Plots")
>> v"0.5.1"
>> 
>> julia> versioninfo()
>> Julia Version 0.4.3
>> Commit a2f713d (2016-01-12 21:37 UTC)
>> Platform Info:
>>   System: Darwin (x86_64-apple-darwin13.4.0)
>>   CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
>>   WORD_SIZE: 64
>>   BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
>>   LAPACK: libopenblas64_
>>   LIBM: libopenlibm
>>   LLVM: libLLVM-3.3
>> 
>> 
>>> On Feb 1, 2016, at 06:38, Tom Breloff wrote:
>>> It's certainly possible, as I can do this:
>>> 
>>> 
>>> 
>>> Looking at my source
>>> (https://github.com/tbreloff/Plots.jl/blob/master/src/backends/gadfly.jl#L490-L492)
>>>  it seems like I'm setting the "grid_color" keyword in the "Gadfly.Theme" 
>>> constructor to match the background color.  However that seems to be what 
>>> you already tried, so I'm not sure what's different.  I tried this is 
>>> IJulia and at the REPL... both results were the same for me.
>>> 
>>> On Mon, Feb 1, 2016 at 8:58 AM, Jon Norberg wrote:
>>> I have searched and tried a few things but cannot remove the background 
>>> grids in Gadfly. Its probably simple and I am missing something 
>>> obvious...Any suggestions would be appreciated.
>>> 
>>> layer(x=E,y=wetness(E,10.0), Geom.line,Theme(default_color=a[1], 
>>> line_width=2pt, grid_color=colorant"white")
>>> 
>>> also tried grid_line_width=0pt
>>> 
>>> but they still show up ( I save it as SVG, but also output in jupiter shows 
>>> them)
>>> 
>> 
>> 
> 
> 



[julia-users] Re: Anonymous functions now faster? Need for functors?

2016-02-01 Thread Eric Forgy
I'd love to understand this better. I've read the comments in the merged PR 
, but a blog post or 
pointers to new docs would be great.



[julia-users] Is the Actor model for parallelism a good paradigm for Julia

2016-02-01 Thread Lyndon White
Hi,

So I do a lot of batch processing, for machine learning.
I have a lot of RAM, 45Gb, and 12 cores.

My normal method for parallel processing is to replicate any common shared 
read-only memory across all workers, using @everywhere.
Then process my data with pmap, or a variation of my own pmapreduce.

On my work yesterday, I could not replicate my common shared memory across 
all workers, as it was too large (17Gb).
So I left it on processor 1, and told the workers to do a remotecall_fetch 
to retrieve it.
This seems to have worked very well, as the workers quickly get out of sync 
and so request data from processor 1 at different times.
And processor 1 is normally unused until the workers are all done anyway.

I was thinking about it after I went home, and realised that this was a 
very rough approximation of the Actor model
(If I am recalling the Actor 
model correctly).

What I am thinking is that I could have 1 worker per service -- where a 
service is a combination of data and the functions that operate on it.
That means more than 1 worker per core.
Whenever a worker needs that data, it remotecall_fetches the function on 
the service's worker.
(Potentially it does smarter things than a remotecall_fetch, so it is 
non-blocking.)


Is this sensible?
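
A rough sketch of what the "service worker" shape might look like (0.4-era API; the data and function names are mine, and `rand(10^6)` stands in for the 17Gb table):

```julia
addprocs(3)

# Worker 2 is the "service": it alone holds the big read-only data.
@everywhere function load_data!()
    global BIGDATA = rand(10^6)
    nothing
end
remotecall_fetch(2, load_data!)

# Accessor that only makes sense where the data lives -- callers always
# route it through worker 2.
@everywhere lookup(i) = BIGDATA[i]

# Any other process asks the service worker, blocking only the caller:
v = remotecall_fetch(2, lookup, 42)
```

A non-blocking variant would use `remotecall` plus a later `fetch`, so a worker can queue its request and keep computing.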



[julia-users] Re: Are there solutions for tuple/fixed-length vector arithmetic?

2016-02-01 Thread Simon Danisch
There is this: https://github.com/SimonDanisch/FixedSizeArrays.jl

Best,
Simon


Am Dienstag, 2. Februar 2016 01:07:11 UTC+1 schrieb Nathan Baum:
>
> In some code I'm writing I have inner loops which, amongst other things, 
> translate spherical coordinates into Cartesian.
>
> I notice that the inner loop is a *lot* faster if I change the code to 
> use tuples. This makes sense, but when I use tuples I can't use the maths 
> operators and functions that I can with vectors.
>
> Is there some package which provides these functions for tuples or, 
> alternatively, which offers alternative implementations of efficient 
> fixed-length vectors?
>
> The code is
>
> rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
>   [r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ)] # slower version
> end
>
>
> vs
>
> rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
>   (r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ)) # faster version
> end
>
>
>

Re: [julia-users] Memory management in Julia

2016-02-01 Thread Madeleine Udell
Fantastic! That helped me track down the problem and it's working now.
On Feb 1, 2016 11:10 AM, "Matt Bauman"  wrote:

> Can you reproduce it if you run julia with the --check-bounds=yes command
> line argument?
>
> On Monday, February 1, 2016 at 1:55:25 PM UTC-5, Madeleine Udell wrote:
>>
>> This is on a mac; we've got a variety of function calls giving errors,
>> some with probability ~.5 and some every time we've run them.
>>
>> On Mon, Feb 1, 2016 at 10:51 AM, Yichao Yu  wrote:
>>
>>> On Mon, Feb 1, 2016 at 1:39 PM, Madeleine Udell
>>>  wrote:
>>> > Hi all,
>>> >
>>> > I'm running into some memory management issues: in particular, a malloc
>>> > error that claims I am modifying an object after freeing it: see this
>>> > question. The error is,
>>> >
>>> > julia(9849,0x7fff705d0300) malloc: *** error for object 0x7f96a332f408:
>>> > incorrect checksum for freed object - object was probably modified
>>> after
>>> > being freed. *** set a breakpoint in malloc_error_break to debug
>>> >
>>> > I'm not sure how to debug it: what's the best way to search for code
>>> that
>>> > might be modifying an object after freeing it in Julia? (For example, I
>>> > don't know what or where malloc_error_break is.)
>>>
>>> Which platform is this and how repeatable is it?
>>>
>>> >
>>> > Thanks!
>>> > Madeleine
>>>
>>
>>
>>
>> --
>> Madeleine Udell
>> Postdoctoral Fellow at the Center for the Mathematics of Information
>> California Institute of Technology
>> *https://courses2.cit.cornell.edu/mru8
>> *
>> (415) 729-4115
>>
>


[julia-users] Re: Are there solutions for tuple/fixed-length vector arithmetic?

2016-02-01 Thread Nathan Baum
I'm surprised that didn't turn up in my Googling.

It looks like exactly what I want.

Thanks.

On Tuesday, 2 February 2016 01:01:47 UTC, Simon Danisch wrote:
>
> There is this: https://github.com/SimonDanisch/FixedSizeArrays.jl
>
> Best,
> Simon
>
>
> Am Dienstag, 2. Februar 2016 01:07:11 UTC+1 schrieb Nathan Baum:
>>
>> In some code I'm writing I have inner loops which, amongst other things, 
>> translate spherical coordinates into Cartesian.
>>
>> I notice that the inner loop is a *lot* faster if I change the code to 
>> use tuples. This makes sense, but when I use tuples I can't use the maths 
>> operators and functions that I can with vectors.
>>
>> Is there some package which provides these functions for tuples or, 
>> alternatively, which offers alternative implementations of efficient 
>> fixed-length vectors?
>>
>> The code is
>>
>> rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
>>   [r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ)] # slower version
>> end
>>
>>
>> vs
>>
>> rll2xyz(r, θ, ϕ) = let cosθ = cos(θ)
>>   (r * cosθ * cos(ϕ), r * cosθ * sin(ϕ), r * sin(θ)) # faster version
>> end
>>
>>
>>

Re: [julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread Gabriel Goh
Excellent point, your function was indeed about 4x faster, but Kristoffer's 
was about 100x :O so I'll stick to his

On Monday, February 1, 2016 at 6:56:56 AM UTC-8, Erik Schnetter wrote:
>
> Compare this function: 
>
> ```Julia 
> function f2(k) 
> M = spzeros(2*k,2*k) 
> for i = 1:k 
> j1 = (i-1)*2+1 
> j2 = i*2 
> M[j1,j1] = rand() 
> M[j2,j1] = rand() 
> M[j1,j2] = rand() 
> M[j2,j2] = rand() 
> end 
> return M 
> end 
> ``` 
> which is much faster. It seems your original code has two performance 
> issues that are unrelated to sparse matrix memory allocation: 
> (1) `randn` allocates a new matrix every time 
> (2) Something about indexing sparse matrices with ranges seems slow (I 
> don't know why) 
>
> If you want to continue to use `randn`, then you can use `randn!` 
> instead, and preallocate the small matrix outside the loop. 
>
> -erik 
>
>
> On Mon, Feb 1, 2016 at 9:42 AM, Kristoffer Carlsson wrote:
> >> However  doing in this way is more cumbersome because you need to have 
> a 
> >> good estimate of the number of entries 
> > 
> > Not true. The difference between pre allocating the arrays and just 
> pushing 
> > into them is not that large due to how julia arrays work (constant 
> > ammortized time etc). 
> > 
> > 
> > On Monday, February 1, 2016 at 1:34:12 PM UTC+1, alan souza wrote: 
> >> 
> >> You could try to use the triplet form (tree vectors containing the 
> row/col 
> >> indexes and the value of the entry) and call the function sparse. 
> >> In this way you can preallocate in advance these three vectors. 
> >> However  doing in this way is more cumbersome because you need to have 
> a 
> >> good estimate of the number of entries and to explicitly calculate the 
> index 
> >> for all entries. 
> >> 
> >> On Sunday, January 31, 2016 at 8:07:56 PM UTC-2, Gabriel Goh wrote: 
> >>> 
> >>> Generating a sparse matrix from scratch seems to be quite memory 
> >>> intensive. and slow. Say I wish to create a large block diagonal 
> matrix with 
> >>> 2x2 block entries. 
> >>> 
> >>> Doing it naively is quite slow 
> >>> 
> >>> function f(k) 
> >>>   M = spzeros(2*k,2*k) 
> >>>   for i = 1:k 
> >>> D = (i-1)*2 + 1:i*2 
> >>> M[D,D] = randn(2,2) 
> >>>   end 
> >>>   return M 
> >>> end 
> >>> 
> >>> julia> @time f(1) 
> >>> 2.534277 seconds (239.26 k allocations: 3.013 GB, 15.58% gc time) 
> >>> 
> >>> Is there a way to speed this up by preallocating the memory somehow? 
>
>
>
> -- 
> Erik Schnetter
> http://www.perimeterinstitute.ca/personal/eschnetter/ 
>


[julia-users] Re: Sparse matrix memory preallocation?

2016-02-01 Thread Gabriel Goh
Thanks! This triplet solution was a miracle

On Monday, February 1, 2016 at 1:46:27 AM UTC-8, Kristoffer Carlsson wrote:
>
> At computer now.
>
> Something like this:
>
> function f(k)
> I, J, V = Int[], Int[], Float64[]
> for i = 1:k
> idxs = (i-1)*2 + 1:i*2
> for i in idxs, j in idxs
> push!(I, i)
> push!(J, j)
> push!(V, rand())
> end
> end
> return sparse(I,J,V)
> end
>
> @time f(1)
> 0.001932 seconds (71 allocations: 4.986 MB)
>
>
>
> On Monday, February 1, 2016 at 9:25:22 AM UTC+1, Kristoffer Carlsson wrote:
>>
>> Create the vectors I J V which holds the nonzero rows, columns and values 
>> respectively and then call sparse(I, J, V). 
>
>

[julia-users] am i prepared for the arraypocalypse?

2016-02-01 Thread Gabriel Goh
My tests run smoothly in the current build of the Julia 0.5 beta. Is everything 
good, or are there breaking changes to come?


[julia-users] Signature for function accepting array of parameterised types

2016-02-01 Thread Samuel Powell
Hi,

Consider the following:

abstract TypeA

type Type1 <: TypeA
end

type Type2 <: TypeA
end

type Type3{T<:TypeA}
prof::T
end

function fun(arr::Array{Type3, 1})
end

t1 = Type1()
t2 = Type2()

t3_1 = Type3(t1)
t3_2 = Type3(t2)

fun([t3_1; t3_2]) # This is fine
fun([t3_1; t3_1]) # This fails with a no method error
fun([t3_2; t3_2]) # This fails with a no method error


What function signature will allow all three calls to dispatch without 
error?

Regards,

Sam.





[julia-users] Function signature for array of parameterised types

2016-02-01 Thread Samuel Powell
Hi,

Consider the following:

abstract TypeA

type Type1 <: TypeA
end

type Type2 <: TypeA
end

type Type3{T<:TypeA}
prof::T
end

function fun(arr::Array{Type3, 1})
end

t1 = Type1()
t2 = Type2()

t3_1 = Type3(t1)
t3_2 = Type3(t2)


fun([t3_1; t3_2])
fun([t3_1; t3_1])
fun([t3_2; t3_2])



[julia-users] Juno IDE

2016-02-01 Thread David Blake
Hi guys, some help please.

A while back I downloaded and installed Julia Studio and wrote 10-20 little 
programs in it.  I found it quite good but now it's been discontinued of 
course.

So I'm looking at Juno, but man I find it hard to use.  I've read a bit on 
here about it, but I still feel like I have no idea what I'm doing.  I'd 
very much appreciate some help with this:

These are pretty basic questions, so please don't flame me.  I'm on Windows 
10, 64 bit.

1) Every time I start Juno, it tells me a new binary version of LightTable 
(LT) is available and do I want to download it.  It doesn't sort of update 
automatically, just opens a link to the download site for LT.  So then I'm 
unclear as to what to do, I can download the LT binary but then what?  I 
have Juno, which is on top of LT, how to upgrade the underlying LT 
version?  Or should I just not worry about it?

2) Also, how would I upgrade the underlying Julia language to the latest 
version, please?  On another site, I saw how to use versioninfo; it shows 
3.10.

3) There seem to be very few commands available via the menu, but lots and 
lots via Ctrl-space.  I find this quite different to most IDEs.  Is this 
just the way LT works? And just a matter of getting used to it?  If so, I'm 
OK with that.

4) The workflow pattern I normally like to use with other languages like 
Python is to write my code in scripts and then run from a console, 
preferably all within an IDE.  So I use Spyder for Python and find it very 
good.  I'd like to use Juno the same way.  As opposed to say having a text 
editor open to code in, and a separate console window to run files from 
etc.  Do people use Juno like this? i.e. like a standalone thing?

Any help appreciated.



[julia-users] UInt8 in hist()

2016-02-01 Thread n . sugimura
Hi, I'm a newbie.

I'm trying to make a histogram for graphics.
BMP data is made of UInt8 values, so I passed a UInt8 array to hist().
It does not work properly.
I suppose the comparisons inside hist() may be getting confused.
So what shall I do?


Here is some sample code. rand(UInt8, 1000) * 1.0 may work fine.

# create data
data1d= rand(UInt8, 1000)
(edges, counts)= hist(data1d, -1:255)
# draw Histogram
using Winston
h= Histogram(edges, counts)
p= FramedPlot()
add(p, h)
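
My guess (not tested against Winston) is that the trouble comes from binning UInt8 samples against edges that include -1; a sketch of one workaround is to widen the samples to Int before calling hist:

```julia
# Widen to Int so hist's edge comparisons and arithmetic happen in a
# signed type that can represent every edge in -1:255.
data1d = rand(UInt8, 1000)
edges, counts = hist(map(Int, data1d), -1:255)
```

This is essentially what the `* 1.0` trick does, but without converting to Float64.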


[julia-users] Re: Need Ref on UInt32 but not Float64

2016-02-01 Thread Bryan Rivera
I figured this out. 

Is using Ref in such a way acceptable?


cfunction(callbackfun, Void, (Ptr{UInt8}, UInt32, Ref{SomeConcreteType}))






[julia-users] Re: Need Ref on UInt32 but not Float64

2016-02-01 Thread Bryan Rivera
I figured it out. 

Is it ok to use Ref this way?


cfunction(callbackfun, Void, (Ptr{UInt8}, UInt32, Ref{SomeConcreteType}))




[julia-users] Re: Juno IDE

2016-02-01 Thread Bryan Rivera
I've seen some enthusiasm about Juno, but I think we might be reinventing 
the wheel here.

Atom's approach using web technologies really appeals to me.

It doesn't have all the features of IntelliJ, but it has the potential to.


Re: [julia-users] Signature for function accepting array of parameterised types

2016-02-01 Thread Mauro
On Mon, 2016-02-01 at 18:29, Samuel Powell  wrote:
> Hi,
>
> Consider the following:
>
> abstract TypeA
>
> type Type1 <: TypeA
> end
>
> type Type2 <: TypeA
> end
>
> type Type3{T<:TypeA}
> prof::T
> end
>
> function fun(arr::Array{Type3, 1})
> end
>
> t1 = Type1()
> t2 = Type2()
>
> t3_1 = Type3(t1)
> t3_2 = Type3(t2)
>
> fun([t3_1; t3_2]) # This is fine
> fun([t3_1; t3_1]) # This fails with a no method error
> fun([t3_2; t3_2]) # This fails with a no method error

This is because of invariance (search the doc or julia-users for it):

julia> Array{Type3{Type1}}<:Array{Type3}
false

This works:

julia> function fun{T<:Type3}(arr::Array{T, 1})
   end
fun (generic function with 2 methods)

julia> fun([t3_1; t3_1])


Re: [julia-users] am i prepared for the arraypocalypse?

2016-02-01 Thread Mauro
> my tests run smoothly in the current build of julia 0.5 beta. Is everything
> good or are there breaking changes to come?

There are always breaking changes to come on Master... that's the point
of Master.   Expect more to come, as the arraypocalypse has not happened
yet (nor is it quite clear how it will happen).


[julia-users] julia on ARM for a drone

2016-02-01 Thread Jeff Waller
Does update last month 

 and 
the discussion about arm nightlies  
mean that 
ARM is generally supported now?

I have a specific purpose in mind that doubles down on the difficulty, 
however.  The install method need not be by RPM, but while the processor is 
an ARMv7, it's more specifically a Cortex-A9, and is not related to Red Hat 
or Raspberry Pi; rather, the OS is Yocto Linux, which is an interesting 
project -- it's a distro built from the result of a cross-compilation 
framework paired with a bunch of metadata describing the processor.  The 
idea is that if one can describe the processor, then the distro can be 
tailored to it.

Trying to get Julia compiled within the OpenEmbedded framework might be a 
good project in itself, but my specific purpose is that this distro is used 
by this drone, and I feel (and would like) that Julia could well play a 
part in the control software.

You know, any sort of tips would help; maybe I'll just try one of the 
nightlies and see what happens.


[julia-users] Crashing REPL experimenting for macro aliases

2016-02-01 Thread Leonardo
Hello,
I'm looking for a way to create a macro alias (so far, without success).

I've defined simple macro:
macro m(ex)
  dump(ex)
end

and the command 
dump(:(@m(1+2)))
returns:
Expr
  head: Symbol macrocall
  args: Array(Any,(2,))
1: Symbol @m
2: Expr
  head: Symbol call
  args: Array(Any,(3,))
1: Symbol +
2: Int64 1
3: Int64 2
  typ: Any
  typ: Any

then I've simulated a call to macro m with:
eval(Expr(:macrocall,[symbol("@m"),:(1+2)]))

but this causes a crash of the REPL:
Please submit a bug report with steps to reproduce this fault, and any 
error messages that follow (in their entirety). Thanks.
Exception: EXCEPTION_ACCESS_VIOLATION at 0x82792b10 -- unknown function 
(ip: 000082792B10)
unknown function (ip: 82792B10)
(I use Julia 0.4.3 under Win64 platform)

BTW, if anyone can suggest how to define a macro alias ...

Many thanks

Leonardo
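
For what it's worth: the Expr above wraps its arguments in a Vector, while `Expr(head, args...)` expects them as separate arguments, which is a likely trigger for the crash. A sketch of the corrected call plus one common way to spell a macro alias (0.4-era syntax; `@m2` is my name for the alias):

```julia
macro m(ex)
    dump(ex)
end

# Splat the args instead of passing a Vector:
eval(Expr(:macrocall, symbol("@m"), :(1+2)))

# A macro alias: forward the expression to @m.  For a same-module alias
# this is usually enough; hygiene subtleties can apply across modules.
macro m2(ex)
    Expr(:macrocall, symbol("@m"), ex)
end

@m2 1+2   # behaves like @m 1+2
```

Either way, a segfault from malformed (but pure-Julia) input is worth a bug report, as the REPL message says.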



Re: [julia-users] Parametric Type Question - updating type

2016-02-01 Thread Mauro
I haven't been following this thread closely, so I have no frame of
reference here, I'm like a child who wanders into the middle of a
julia-users thread...

A few observations:

- functions being called with composite types which have some
  non-concrete fields suffer no performance penalty, as long as you
  don't use those non-concrete fields
- if using a non-concrete field, you can use kernel
  functions (aka barrier functions):
  http://docs.julialang.org/en/release-0.4/manual/performance-tips/#separate-kernel-functions
  The kernel fn could be what Tom suggested

npv(myBond) = isdefined(myBond, :npv_val) ? myBond.npv_val : _npv!(myBond, 
myBond.pricingEngine)

_npv!(bond, pe::PricingEngine1) = ... # writes to the npv_val field and returns it
_npv!(bond, pe::PricingEngine2) = ...
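
A self-contained sketch of that barrier-function pattern (0.4-era syntax; all names and formulas here are placeholders, not QuantLib's):

```julia
abstract PricingEngine
type Engine1 <: PricingEngine; rate::Float64; end
type Engine2 <: PricingEngine; rate::Float64; end

type Bond
    engine::PricingEngine   # non-concrete field: fine, as long as only
    npv_val::Float64        # the kernel touches the engine's internals
    calculated::Bool
    Bond(e) = new(e, 0.0, false)
end

# Outer function: caches the result; the only use of the non-concrete
# field is the kernel call, where dispatch picks the concrete method.
function npv(b::Bond)
    if !b.calculated
        b.npv_val = _npv(b.engine)
        b.calculated = true
    end
    return b.npv_val
end

# Kernels, one per concrete engine:
_npv(e::Engine1) = 100.0 / (1.0 + e.rate)
_npv(e::Engine2) = 100.0 * exp(-e.rate)

npv(Bond(Engine1(0.05)))   # ≈ 95.24
```

The user still just calls `npv(myBond)`, the cache lives on the bond, and the type-unstable field never leaks into the hot code path.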

On Tue, 2016-02-02 at 00:41, Christopher Alexander  wrote:
> I've been trying to think of a way to implement this, but I'm having some
> trouble.  Perhaps this can be a discussion around organizing objects in
> Julia to best exploit the benefits of multiple dispatch.  Let's say I have
> a Bond type:
>
> type Bond{P <: PricingEngine}
>  # some attributes here
>  pricingEngine::P
> end
>
>
> Then the particular PricingEngine object for each instantiation of "Bond"
> has its own term structure (perhaps this could be stored with the bond
> itself?).  Each PricingEngine sub type (in my code PricingEngine is
> actually an abstract type itself) has its own calculation method, where
> various components of the bond are calculated (e.g. NPV, etc).  I suppose
> this could be separated out, but I essentially want to provide to the end
> user something like npv(myBond).  The bond knows whether it's been
> calculated or not, and if it hasn't, it does so via its pricing engine.
>  Otherwise, it returns a cached value of its NPV (already having been
> calculated).  If I break all of these things out (bond/instrument, term
> structure, and pricing engine), I would envision a method like this:
> npv(bond, pricing_engine, term_structure)
>
> Is there a better/more "Julian" way to organize this?  Perhaps keeping the
> TermStructure separate from everyone and passing it into methods where I
> need it?
>
>
> On Monday, February 1, 2016 at 5:05:21 PM UTC-5, Tom Breloff wrote:
>>
>> Just so you realize... in this example "pricingEngine" has an abstract
>> type, and you've possibly lost whatever performance gain you were hoping
>> for in your original question.  To solve this, you either need to take the same
>> approach in defining and updating the Bond object, or maybe rethink how
>> you're doing this.  You should consider utilizing multiple dispatch a
>> little more to keep your PricingEngine and TermStructure separate:
>>
>> do_something(engine::PricingEngine, term::TermStructure) = ...
>>
>>
>>
>> On Mon, Feb 1, 2016 at 2:07 PM, Christopher Alexander > > wrote:
>>
>>> This doesn't seem to work if your PricingEngine object is attached to
>>> some other object.  Like, for example if you have:
>>>
>>> type Bond
>>>  pricingEngine::PricingEngine
>>> end
>>>
>>>
>>> myPE = PricingEngine(4.5, 5.5, TermStructureA())
>>>
>>>
>>> myBond = Bond(myPE)
>>>
>>>
>>> myPE = update_ts(myPE, TermStructureB())
>>>
>>>
>>> At that point, myBond's pricing engine still points to the older myPE
>>> with TermStructureA.
>>>
>>> On Monday, February 1, 2016 at 1:58:54 PM UTC-5, Christopher Alexander
>>> wrote:

 So something like:

 function update_ts(pe::PricingEngine, newTS::TermStructure)
  newPE = PricingEngine(pe.varA, pe.varB, newTS)
  return newPE
 end


 myPE = PricingEngine(4.5, 5.5, TermStructureA())


 myPE = update_ts(myPE, TermStructureB())

 You probably wouldn't be able to update the "myPE" object in place right
 (i.e. updating it in the actual update_ts method and then returning 
 itself)?


 On Monday, February 1, 2016 at 1:50:41 PM UTC-5, Tom Breloff wrote:
>
> You could just construct a new object with the new TermStructure,
> instead of overwriting the old one.
>
> On Mon, Feb 1, 2016 at 1:27 PM, Christopher Alexander <
> uvap...@gmail.com> wrote:
>
>> Hello all, I have a question about the usage of parametric types.  I
>> know these bring about a performance boost (at least that was my
>> understanding), but I have a situation where I have a parametric type
>> defined as such:
>>
>> type PricingEngine{T <: TermStructure}
>>  varA::Float64
>>  varB::Float64
>>  ts::T
>> end
>>
>>
>> But then I need to actually swap the existing term structure with
>> another subtype of TermStructure further down the road. Using parametric
>> types, it complains because I guess it's locked in to using whatever
>> TermStructure sub type is initially there when I instantiate the
>> PricingEngine type.  Is there anyway to do such an update while still 
>> using
>> a type parameter, or am I stuck just with a definition that uses the
>> broader abstract type?

Re: [julia-users] am i prepared for the arraypocalypse?

2016-02-01 Thread Christopher Alexander
What is the arraypocalypse?

On Tuesday, February 2, 2016 at 1:28:26 AM UTC-5, Mauro wrote:
>
> > my tests run smoothly in the current build of julia 0.5 beta. Is 
> everything 
> > good or are there breaking changes to come? 
>
> There are always breaking changes to come on Master... that's the point 
> of Master.   Expect more to come, as the arraypocalypse has not happened 
> yet (nor it is quite clear how it will happen). 
>


[julia-users] Re: Juno IDE

2016-02-01 Thread David Blake
Bryan, I agree.  IDEs have been around for decades now.  I would have 
thought they could just hook Julia into a well established one.

An IDE must be a huge thing to build from scratch, or even with an 
underlying text editor.

On Tuesday, 2 February 2016 18:37:40 UTC+13, Bryan Rivera wrote:
>
> I've seen some enthusiasm about Juno, but I think we might be reinventing 
> the wheel here.
>
> Atom's approach using web technologies really appeals to me.
>
> It doesn't have all the features of IntelliJ, but it has the potential to.
>