[julia-users] Re: newest stable version of julia on EC2 AWS instances - starcluster

2016-09-09 Thread Michael Prentiss
I do not know the answer, but you may want to use Amazon's cluster generator, cfncluster. It is under constant development, unlike starcluster, and may be easier.

https://github.com/awslabs/cfncluster


On Friday, September 9, 2016 at 7:21:33 PM UTC-5, Alexandros Fakos wrote:
>
> Hi,
>
> I am trying to run julia in parallel on EC2 AWS. I use the starcluster 
> package to create a cluster of instances.
>
> The problem is that the instances come with the 0.3.0 prerelease already 
> installed.
>
> I log in to the starcluster master node and follow the instructions from 
> http://julialang.org/downloads/platform.html 
>
> sudo add-apt-repository ppa:staticfloat/juliareleases
> sudo add-apt-repository ppa:staticfloat/julia-deps
> sudo apt-get update   (this command is giving me errors: W:Failed to fetch 
> http://us-west-1.ec2.archive.ubuntu.com/ubuntu/distrs/raring etc. and 
> other similar messages)
> sudo apt-get install julia
>
> Unfortunately, when I type julia in the terminal, the 0.3.0 prerelease 
> still comes up.
>
> Note that in my regular EC2 instance the above commands work and install 
> julia 0.4.6
>
> As a general question, how can I launch instances with the latest stable 
> version of julia preinstalled? If I cannot do that, how can I install the 
> latest version of julia in all my cluster nodes?
>
> Thanks a lot,
> Alex
>


[julia-users] Re: Plasma Actuator Simulation

2015-09-28 Thread Michael Prentiss
In my experience it is helpful to have a Fortran version to compare speeds 
against when learning Julia. That would be easy to do for your problem.

It is easy to write functional Julia code that is very slow but looks 
okay. The best way for me to catch that was to have a basis of comparison.
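
For reference, here is the kind of rewrite that usually closes most of the gap for code like the post below: move the hot update into a function and write it as explicit in-place loops, instead of slice arithmetic that allocates several large temporaries on every iteration. This is only a sketch of the interior potential update, using names taken from the post (phi1, eps, dx, dy); it is not the author's code.

# Devectorized sketch of the interior update for the potential.
# Writing into phi1 in place avoids allocating a new array for every slice expression.
function update_phi!(phi1, eps, dx, dy)
    nx, ny = size(phi1)
    for j = 2:ny-1, i = 2:nx-1
        num = eps[i,j]*(phi1[i+1,j]/dx^2 + phi1[i,j+1]/dy^2) +
              eps[i-1,j]*phi1[i-1,j]/dx^2 +
              eps[i,j-1]*phi1[i,j-1]/dy^2
        den = (eps[i,j] + eps[i-1,j])/dx^2 + (eps[i,j] + eps[i,j-1])/dy^2
        phi1[i,j] = num/den
    end
    return phi1
end

# Small self-contained usage example on a toy grid.
phi  = rand(10, 10)
perm = ones(10, 10)
update_phi!(phi, perm, 0.01, 0.01)

The convergence loop, boundary conditions, and electrode resets wrap around this exactly as they do in the vectorized version.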

On Monday, September 28, 2015 at 9:14:32 PM UTC-5, Marius wrote:
>
> Hi everybody,
>
> I wrote a simulation code for a plasma actuator. It is based on my 
> previous code written in Python and uses the Suzen model combined with the 
> Navier-Stokes equations. It's about 5 times faster than Python, but the 
> simulation still takes days to finish. I am new to Julia and I tried to follow 
> the performance tips, but so far this is as much as I could do. Any 
> suggestions for improving speed are highly appreciated. This is the code:
>
>
> using PyPlot
> using PyCall
>
> const nx=441
> const ny=441
> const nt=100
> const Length=0.011#11 mm
> const dx=Length/(nx-1)
> const dy=Length/(ny-1)
> x=linspace(-Length/2,Length/2,nx)
> y=linspace(-Length/2,Length/2,ny)
> 
> phi1 = zeros(ny,nx) #potential
> eps=zeros(ny,nx) #permittivity
> roc=ones(ny,nx)#charge
> ld=ones(ny,nx)#Debye length
> ### All Electrodes 
> # initial boundary conditions
> phi1[221:222,49:57]=1400.0
> phi1[218:219,57:97]=0.0
> #phi1[221:222,97:105]=0.0
> phi1[221:222,145:153]=1400.0
> phi1[218:219,153:193]=0.0
> #phi1[221:222,193:201]=0.0
> #phi1[221:222,241:249]=0.0
> phi1[218:219,249:289]=0.0
> phi1[221:222,289:297]=1400.0
> #phi1[221:222,337:345]=0.0
> phi1[218:219,345:385]=0.0
> phi1[221:222,385:393]=1400.0
> eps[1:220,:]=2.7
> eps[221:end,:]=1.0
>
> eps[221:221,:]=(eps[222:222,1:end].*eps[220:220,1:end])./(((dx/(2*dx))*eps[222:222,1:end])+
> (((dx/(2*dx))*eps[220:220,1:end])))
> ###
> #convergence parameters
> to1=1e-8
> ###Gauss-Seidel solver
> max_phi1_diff=1
> while max_phi1_diff > to1
>   phi1_old=copy(phi1)
>   phi1[2:nx-1,2:ny-1]=((eps[2:nx-1,2:ny-1].*((phi1[3:nx,2:ny-1]/(dx^2))+(phi1[2:nx-1,3:ny]/(dy^2))))+
> ((eps[1:nx-2,2:ny-1].*phi1[1:nx-2,2:ny-1])/(dx^2))+
> ((eps[2:nx-1,1:ny-2].*phi1[2:nx-1,1:ny-2])/(dy^2)))./(((eps[2:nx-1,2:ny-1]+eps[1:nx-2,2:ny-1])/(dx^2))+
> ((eps[2:nx-1,2:ny-1]+eps[2:nx-1,1:ny-2])/(dy^2)))
>   phi1[end,:]=phi1[end-1,:]
>   phi1[1,:]=phi1[2,:]
>   phi1[:,end]=phi1[:,end-1]
>   phi1[:,1]=phi1[:,2]
>   phi1[221:222,49:57]=1400.0
>   phi1[218:219,57:97]=0.0
>   #phi1[221:222,97:105]=0.0
>   phi1[221:222,145:153]=1400.0
>   phi1[218:219,153:193]=0.0
>   #phi1[221:222,193:201]=0.0
>   #phi1[221:222,241:249]=0.0
>   phi1[218:219,249:289]=0.0
>   phi1[221:222,289:297]=1400.0
>   #phi1[221:222,337:345]=0.0
>   phi1[218:219,345:385]=0.0
>   phi1[221:222,385:393]=1400.0
>   eps[1:220,:]=2.7
>   eps[221:end,:]=1.0
>   phi1_diff = phi1.-phi1_old
>   max_phi1_diff=maximum(abs(phi1_diff))
> end
>
> fig = plt.figure(figsize = (11,7), dpi=100)
> plt.contourf(x*1000,y*1000,phi1,50)
> cbar = plt.colorbar()
> savefig("wflow1368AC29SeptemberElectricField.png")
> ### CHARGE ###
> ld[221:end,:]=0.00017 #Debye length
> ld[1:220,:]=1.0 #Debye length
> roc[218:219,57:97]=0.00750
> roc[218:219,153:193]=0.00750
> roc[218:219,249:289]=0.00750
> roc[218:219,345:385]=0.00750
> eps[1:220,:]=2.7
> eps[221:end,:]=1.0
>
> eps[221:221,:]=(eps[222:222,1:end].*eps[220:220,1:end])./(((dx/(2*dx))*eps[222:222,1:end])+
> (((dx/(2*dx))*eps[220:220,1:end])))
> roc[:,end] =0
> roc[end,:] = 0
> roc[:,1] =0
> roc[1,:] = 0
> #convergence parameters
> to1=1e-8
> ###Gauss-Seidel solver
> max_roc_diff=1
> while max_roc_diff > to1
>   roc_old=copy(roc)
>   
> roc[2:nx-1,2:ny-1]=((eps[2:nx-1,2:ny-1].*((roc[3:nx,2:ny-1]/(dx^2)).+(roc[2:nx-1,3:ny]/dy^2)))+
> ((eps[1:nx-2,2:ny-1].*roc[1:nx-2,2:ny-1])/dx^2)+
> 
> ((eps[2:nx-1,1:ny-2].*roc[2:nx-1,1:ny-2])/dy^2))./(((eps[2:nx-1,2:ny-1].+eps[1:nx-2,2:ny-1])/dx^2)+
> 
> ((eps[2:nx-1,2:ny-1]+eps[2:nx-1,1:ny-2])/dy^2)+(1./(ld[2:nx-1,2:ny-1].^2)))
>   ld[221:end,:]=0.00017 #Debye length
>   ld[1:220,:]=1.0 #Debye length
>   roc[218:219,57:97]=0.00750
>   roc[218:219,153:193]=0.00750
>   roc[218:219,249:289]=0.00750
>   roc[218:219,345:385]=0.00750
>   roc_diff = roc.-roc_old
>   max_roc_diff=maximum(abs(roc_diff))
> end
>
> fig = plt.figure(figsize = (11,7), dpi=100)
> plt.contourf(x*1000,y*1000,roc,50)
> cbar = plt.colorbar()
> savefig("wflow1368AC29SeptemberCharge.png")
> # BODY FORCE ###
> F1=ones(ny,nx)
> Fx1=ones(ny,nx)
> Fy1=ones(ny,nx)
>
> F1[2:nx-1,2:ny-1]=roc[2:nx-1,2:ny-1].*(-(((phi1[2:nx-1,2:ny-1]-phi1[1:nx-2,2:ny-1])/dx)+((phi1[2:nx-1,2:ny-1]-phi1[2:nx-1,1:ny-2])/dy)))
> F1[221:222,49:57]=0.0
> F1[218:219,57:97]=0.0
> F1[221:222,97:105]=0.0
> F1[221:222,145:153]=0.0
> F1[218:219,153:193]=0.0
> F1[

[julia-users] Re: Documentation of operators

2015-09-24 Thread Michael Prentiss
I ran into this problem before with punctuation.
https://github.com/JuliaLang/julia/commit/b418a03529a9afec07c5aa032a9124b03cef912e#diff-91ec6806d45dd62d07012f6a018b151f

Maybe this should be addressed again.
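
For reference, here is a minimal sketch of the two constructs being asked about below (the module and function names are made up for illustration): -> builds an anonymous function, and .. in an import path means "the enclosing module", i.e. a relative module path.

second = x -> x[2]                   # anonymous function: take the second element
map(second, [(1, "a"), (2, "b")])    # returns ["a", "b"]

module Outer
    module Sort
        ranked(v) = sort(v)
    end
    module Client
        import ..Sort                # ".." walks up to Outer, so this is Outer.Sort
        use_it(v) = Sort.ranked(v)
    end
end
Outer.Client.use_it([3, 1, 2])       # returns [1, 2, 3]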



On Wednesday, September 23, 2015 at 5:32:57 PM UTC-5, Alex Copeland wrote:
>
>
>
> Hi,
>
> Can someone point me to the documentation for '..'  and '->'  as in 
>  'include ..Sort'  and x -> x[2] . I've dug around in the source and in 
> readthedocs but patterns like this are the devil to search for unless they 
> have a text alias (that you happen to know). 
>
> Thanks,
> Alex
>


[julia-users] Re: ANN: Immerse package, and more videos on interactive plotting

2015-09-16 Thread Michael Prentiss
Thanks for the detailed answers. They were very helpful.

This sort of thing, where you go from data to a calculated abstraction and then 
recover the data, is so frequent that a generic way of building this kind of 
tool will be hugely valuable. Recovering data by text is useful, but there are 
times when graphics are necessary. The point of my question was: if you have 
the data and a calculated abstraction, can your infrastructure be hotwired to 
do something similar? After this groundwork, it is just a matter of time until 
others add more interfaces. This is really exciting stuff.
Thanks



On Monday, September 14, 2015 at 10:15:04 AM UTC-5, Tim Holy wrote:
>
> I'm pleased to announce the availability of the Immerse package: 
> https://github.com/JuliaGraphics/Immerse.jl 
>
> Immerse is designed to add a new layer of interactivity to Gadfly plots. 
>
> I've also posted two tutorial videos: 
> https://www.youtube.com/watch?v=urM4SY92QTI 
> https://www.youtube.com/watch?v=2ic5vj9hD-w 
>
> The second video in particular has "the fun stuff," and shows some 
> features 
> I've long wished for in Julia: really good ways to go backwards from 
> plotted 
> objects to the data underlying each object, so that one can figure out 
> what's 
> going on in a complex, multidimensional data set. 
>
> It probably goes without saying, but Immerse is so new that you should 
> expect 
> a certain amount of API churn as it evolves. In particular, I've already 
> changed one of the `hit` callback arguments since I filmed that second 
> video 
> this morning :-). However, I personally find that it's already quite 
> functional 
> and darned handy; I encourage you to start making use of it and see 
> whether it 
> makes a difference in your work. 
>
> Despite my attempts to upload these videos to the JuliaLanguage youtube 
> channel, these seem to be going to my personal channel (which I didn't 
> even 
> have, before now...). If anyone has any tips about how to transfer them, 
> I'd 
> be grateful. 
>
> Best, 
> --Tim 
>
>

[julia-users] Re: ANN: Immerse package, and more videos on interactive plotting

2015-09-14 Thread Michael Prentiss
I was thinking of the second option. It would be great to be able to hotwire 
the Gadfly part and read in plots with pointers to previously rendered images. 
I do not know how difficult it would be to mimic Gadfly output from other 
sources; how tricky was that part of this work? It would be great to make this 
as generic as possible, because it is so powerful.
Best,
Mike


[julia-users] Re: ANN: Immerse package, and more videos on interactive plotting

2015-09-14 Thread Michael Prentiss
+100. This has loads of possibilities. Great work.
I am guessing it would be possible, but how difficult would it be to read in 
a jpg and explore it in a similar fashion?

On Monday, September 14, 2015 at 10:15:04 AM UTC-5, Tim Holy wrote:
>
> I'm pleased to announce the availability of the Immerse package: 
> https://github.com/JuliaGraphics/Immerse.jl 
>
> Immerse is designed to add a new layer of interactivity to Gadfly plots. 
>
> I've also posted two tutorial videos: 
> https://www.youtube.com/watch?v=urM4SY92QTI 
> https://www.youtube.com/watch?v=2ic5vj9hD-w 
>
> The second video in particular has "the fun stuff," and shows some 
> features 
> I've long wished for in Julia: really good ways to go backwards from 
> plotted 
> objects to the data underlying each object, so that one can figure out 
> what's 
> going on in a complex, multidimensional data set. 
>
> It probably goes without saying, but Immerse is so new that you should 
> expect 
> a certain amount of API churn as it evolves. In particular, I've already 
> changed one of the `hit` callback arguments since I filmed that second 
> video 
> this morning :-). However, I personally find that it's already quite 
> functional 
> and darned handy; I encourage you to start making use of it and see 
> whether it 
> makes a difference in your work. 
>
> Despite my attempts to upload these videos to the JuliaLanguage youtube 
> channel, these seem to be going to my personal channel (which I didn't 
> even 
> have, before now...). If anyone has any tips about how to transfer them, 
> I'd 
> be grateful. 
>
> Best, 
> --Tim 
>
>

[julia-users] Re: [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-05 Thread Michael Prentiss
This looks very impressive. I assume it works with multi-dimensional 
functions such as f(x, y, z)?

It also looks very fast. What are its limitations? Where would you 
still use analytic derivatives?
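
For what it is worth, the usual pattern for a function of several variables is to write it as a function of a single vector argument and differentiate that. A minimal sketch, assuming the package's gradient entry point (check the v0.1.0 README for the exact API):

using ForwardDiff

# f(x, y, z) expressed as a function of one vector argument
f(v) = v[1]^2 * sin(v[2]) + exp(v[3])

g = ForwardDiff.gradient(f, [1.0, 0.5, 2.0])
# g should be approximately [2*1.0*sin(0.5), 1.0^2*cos(0.5), exp(2.0)]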


[julia-users] Re: ANN: Testing specific Julia versions on Travis CI

2015-07-30 Thread Michael Prentiss
That is great news.   Well done. 


[julia-users] Re: ANN: Testing specific Julia versions on Travis CI

2015-07-30 Thread Michael Prentiss
This is great progress. 

Similarly, is there a way for benchmarking on different versions of the 
code?
Automating this will be very helpful.

>
>

[julia-users] Re: ANN: Testing specific Julia versions on Travis CI

2015-07-30 Thread Michael Prentiss
This is great progress.
Along these lines, is there a way to do benchmarking against different 
versions of the code?
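
One low-tech way to get per-version numbers out of the same Travis matrix described below is a script that times the code of interest and labels the output with VERSION, so the logs of the different jobs can be compared side by side. A rough sketch (the file name and the workload are made up for illustration):

# bench/runbenchmarks.jl -- run identical timings on every Julia version in
# the build matrix and tag the result with the running version.
function work(n)
    s = 0.0
    for i = 1:n
        s += sqrt(i)
    end
    return s
end

work(10)                        # warm up so compilation is not included in the timing
t = @elapsed work(10^7)
println("julia $VERSION: work(10^7) took $t seconds")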

On Thursday, July 30, 2015 at 7:20:06 AM UTC-5, Tony Kelman wrote:
>
> Hey folks, an announcement for package authors and users who care about 
> testing:
>
> We've had support for Julia package testing on Travis CI 
>  for almost 9 months now, ref 
> https://groups.google.com/forum/#!msg/julia-users/BtCxh4k9hZA/ngUvxdxOxQ8J 
> if you missed the original announcement. Up to this point we supported the 
> following settings for which Julia version to test against:
>
> language: julia
> julia:
> - release
> - nightly
>
> Release has meant the latest release version in the 0.3.x series, and 
> nightly has meant the latest nightly build of 0.4-dev master. Once Julia 
> 0.4.0 gets released, the meaning of these settings will change, where 
> release will be the latest version in the 0.4.x series, and nightly will be 
> the latest nightly build of 0.5-dev master. Considering the wide install 
> base and number of packages that may want to continue supporting 0.3 even 
> after 0.4.0 gets released, we've just added support for additional version 
> options in your .travis.yml file. You can now do
>
> julia: 
> - release
> - nightly
> - 0.3
>
> Or, if you want to test with specific point releases, you can do that too 
> (there should not usually be much need for this, but it could be useful 
> once in a while to compare different point releases):
>
> julia: 
> - release
> - nightly
> - 0.3
> - 0.3.10
>
> The oldest point release for which we have generic Linux binaries 
> available is 0.3.1. If you enable multi-os support for your repository (see 
> http://docs.travis-ci.com/user/multi-os/), then you can go back as far as 
> 0.2.0 on OS X. Note that you'd need to replace the default test script with 
> the old-fashioned `julia test/runtests.jl` since `Pkg.test` and 
> `--check-bounds=yes` are not supported on Julia version 0.2.x. The 
> downloads of those versions would fail on Linux workers so you may need to 
> set up a build matrix with excluded jobs (see 
> http://docs.travis-ci.com/user/customizing-the-build/#Build-Matrix).
>
> Let us know if you have any questions or issues.
>
> Happy testing,
> Tony (with thanks to @ninjin and @staticfloat for PR review)
>
>

[julia-users] Re: Newbie help... First implementation of 3D heat equation solver VERY slow in Julia

2015-04-28 Thread Michael Prentiss
I implemented a program in both Fortran and Julia for time comparisons when 
learning the language. This was very helpful for finding problems in how I was 
writing Julia. Maybe I did not read carefully enough, but I would also compile 
the Fortran with the Intel compilers (not MKL) instead of gcc as another means 
of comparing speed; the Intel compilers tend to produce faster executables.
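
For what it is worth, the pattern that usually closes most of the gap for stencil codes like the one below is to put the update inside a function and write it as explicit in-place loops over preallocated arrays. A generic sketch of a 3D heat-equation step (illustrative only, not the code from the pastebin link):

# One explicit step of the 3D heat equation, updating a preallocated buffer
# in place. alpha is the diffusivity, dt the time step, dx the grid spacing.
function step!(unew, u, alpha, dt, dx)
    c = alpha*dt/dx^2
    nx, ny, nz = size(u)
    @inbounds for k = 2:nz-1, j = 2:ny-1, i = 2:nx-1
        unew[i,j,k] = u[i,j,k] + c*(u[i+1,j,k] + u[i-1,j,k] +
                                    u[i,j+1,k] + u[i,j-1,k] +
                                    u[i,j,k+1] + u[i,j,k-1] - 6u[i,j,k])
    end
    return unew
end

# Driver kept inside a function as well, so no work happens in global scope.
function run_demo(nsteps)
    u = zeros(100, 100, 100); u[50,50,50] = 1.0
    unew = similar(u)
    for t = 1:nsteps
        step!(unew, u, 1e-4, 0.1, 0.01)
        u, unew = unew, u       # swap buffers instead of copying
    end
    return u
end

run_demo(1000)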


On Saturday, April 25, 2015 at 10:21:25 AM UTC-5, Ángel de Vicente wrote:
>
> Hi,
>
> a complete Julia newbie here... I spent a couple of days learning the 
> syntax and main aspects of Julia, and since I heard many good things about 
> it, I decided to try a little program to see how it compares against the 
> other ones I regularly use: Fortran and Python.
>
> I wrote a minimal program to solve the 3D heat equation in a cube of 
> 100x100x100 points in the three languages and the time it takes to run in 
> each one is:
>
> Fortran: ~7s
> Python: ~33s
> Julia: ~80s
>
> The code runs for 1000 iterations, and I'm being nice to Julia, since the 
> programs in Fortran and Python write 100 HDF5 files with the complete 100^3 
> data (every 10 iterations).
>
> I attach the code (and you can also get it at: 
> http://pastebin.com/y5HnbWQ1)
>
> Am I doing something obviously wrong? Any suggestions on how to improve 
> its speed?
>
> Thanks a lot,
> Ángel de Vicente
>
>

[julia-users] Re: Finding multiple local minima in Julia

2014-07-26 Thread Michael Prentiss
What you are doing makes sense. Starting from multiple starting points is 
important.

I am curious why you don't just run 20 separate 1-processor jobs 
instead of bothering with the parallelism.
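
For concreteness, here is a rough sketch of the parallel-map-over-random-starts idea discussed below, written with current Optim.jl and Distributed syntax (the 2014 API differed in details) and with a toy objective standing in for the MLE:

using Distributed
addprocs(4)

@everywhere using Optim
@everywhere negloglik(x) = (x[1]^2 - 2)^2 + (x[2] + 1)^2    # toy stand-in for the MLE

starts  = [randn(2) for _ in 1:20]                          # 20 random starting points
results = pmap(x0 -> optimize(negloglik, x0), starts)       # one independent run per start
best    = results[argmin([minimum(r) for r in results])]    # keep the lowest objective found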


On Saturday, July 26, 2014 11:22:07 AM UTC-5, Iain Dunning wrote:
>
> The idea is to call the optimize function multiple times in parallel, not 
> to call it once and let it do parallel multistart.
>
> Check out the "parallel map and loops" section of the parallel programming 
> chapter in the Julia manual, I think it'll be clearer there.
>
> On Friday, July 25, 2014 8:00:40 PM UTC-4, Charles Martineau wrote:
>>
>> Thank you for your answer. So I would have to loop over, say, 20 random 
>> sets of starting points, where in my loop I would use the Optim package to 
>> minimize my MLE function for each random set. Where online is the documentation 
>> that shows how to specify that we want the command 
>>
>> Optim.optimize(my function, etc.) to be parallelized? Sorry for my 
>> ignorance, I am new to Julia!
>>
>>
>> On Friday, July 25, 2014 2:04:08 PM UTC-7, Iain Dunning wrote:
>>>
>>> I'm not familiar with that particular package, but the Julia way to do 
>>> it could be to use the Optim.jl package and create a random set of starting 
>>> points, and do a parallel-map over that set of starting points. Should work 
>>> quite well. Trickier (maybe) would be to just give each processor a 
>>> different random seed and generate starting points on each processor.
>>>
>>> On Friday, July 25, 2014 3:05:05 PM UTC-4, Charles Martineau wrote:

 Dear Julia developers and users,

 I am currently using in Matlab the multisearch algorithm to find 
 multiple local minima: 
 http://www.mathworks.com/help/gads/multistart-class.html for a MLE 
 function.
 I use this Multisearch in a parallel setup as well.

 Can I do something similar in Julia using parallel programming?

 Thank you

 Charles



[julia-users] Re: Where to get the documentation for version 0.3?

2014-07-22 Thread Michael Prentiss
1. Download the source and unzip it.
2. It is all under julia/doc.
3. Run make in that directory to build the docs.



On Tuesday, July 22, 2014 10:56:56 PM UTC-5, K leo wrote:
>
>
>

[julia-users] Re: build-in function to find inverse of a matrix

2014-07-21 Thread Michael Prentiss
+1  
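
For anyone landing here from a search, a two-line illustration; when the inverse itself is not needed, the backslash solve is the usual recommendation:

A = [2.0 1.0; 1.0 3.0]
b = [1.0, 2.0]

Ainv = inv(A)       # explicit matrix inverse
x    = A \ b        # solves A*x = b directly, without forming inv(A)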

On Friday, July 18, 2014 3:40:14 PM UTC-5, Viral Shah wrote:
>
> I think that most new users are unlikely to know about apropos. Perhaps we 
> should put it in the julia banner.
>
> We can say something like:
> Type "help()" for function usage or "apropos()" to search the 
> documentation.
>
> apropos() could then just print a message about how to use it, just like 
> help() does.
>
> -viral
>
> On Wednesday, July 16, 2014 7:35:44 PM UTC+5:30, Hans W Borchers wrote:
>>
>> But  apropos  does find it:
>>
>> julia> apropos("matrix inverse")
>> INFO: Loading help data...
>> Base.inv(M)
>>
>>
>> On Wednesday, July 16, 2014 2:59:05 PM UTC+2, Tomas Lycken wrote:
>>>
>>> `inv` will do that for you. 
>>> http://docs.julialang.org/en/latest/stdlib/linalg/#Base.inv
>>>
>>> It's a little unfortunate that the help text is "Matrix inverse", yet 
>>> searching for that exact phrase yields no results at all. Is it possible to 
>>> make documentation search be full-text for help text as well?
>>>
>>> // T
>>>
>>> On Wednesday, July 16, 2014 2:39:53 PM UTC+2, Alan Chan wrote:

 sorry if it's a stupid question. But I cannot find one in the online 
 doc.

 Thanks

>>>

[julia-users] Re: 100 Julia exercises

2014-07-05 Thread Michael Prentiss
Julia is currently not as performant with anonymous functions and 
comprehensions; the compiler has a harder time optimizing them. 
This is not a surprise, and it is known to the language designers.

On Saturday, July 5, 2014 8:01:46 PM UTC-5, james.dill...@gmail.com wrote:
>
>
> In Apprentice.6,  I don't think you want to use the sqrtm().  sqrt() is 
> already vectorized over the matrix.  Also a couple of '.'s are misplaced, 
> so perhaps instead:
>
> Z =rand(10,2)
>
> D = sqrt((Z[:,1].-Z[:,1]').^2+(Z[:,2].-Z[:,2]').^2)
>
> As a newbie myself, what surprised me is that this is faster and allocates 
> less memory than the comprehension:
>
> D =  [norm(Z[i,:]-Z[j,:],2) for i = 1:10, j = 1:10]
>
> I am sure someone else here can explain why.
>
> Jim
> On Sunday, June 22, 2014 10:43:32 AM UTC-4, Michiaki Ariga wrote:
>>
>> Hi all,
>>
>> I'm a Julia newbee, and I'm trying to learn Julia and wrote Julia version 
>> of rougier's 100 numpy exercises(
>> http://www.loria.fr/~rougier/teaching/numpy.100/index.html).
>>
>> https://github.com/chezou/julia-100-exercises
>>
>> I'd like you to tell me more "julia way" or something wrong with.
>>
>> Best regards,
>> Michiaki
>>
>