[julia-users] printf format for round-trip Floats

2016-09-01 Thread 'Greg Plowman' via julia-users
I'm trying to print Float64 values with all significant digits, so that 
re-reading these numbers from the output strings gives back exactly 
the same number (is this called a round-trip?).

I understand that Julia uses the Grisu algorithm to do this automatically for 
values displayed at the REPL and with println().

I'm using @printf for output, so I tried "%20.17g" as a format specifier, 
and this seems to work.

But then I compared this to println(), and some numbers differ in the last 
digit (e.g. @printf displays 389557.48616130429, whereas println() displays 
389557.4861613043).
So it seems @printf with "%.17g" sometimes displays an extra, unnecessary digit?

Is there another printf format specifier that I can use?

Alternatively, I'm using the following to first convert the Float64 to a 
string, and then pass the string to @printf:

iobuffer = IOBuffer()
print(iobuffer, x)                      # print() emits the shortest round-trip digits
str_hits = takebuf_string(iobuffer)     # pull the string out of the buffer
@printf("%20s", str_hits)               # pass the string, not x, to the %s format

but this seems a bit messy.
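
For what it's worth, a more compact version of the same workaround (a sketch; 
it relies on print() producing the same shortest round-trip digits that 
println() shows):

```Julia
str_x = sprint(print, x)   # same digits print()/println() would emit for x
@printf("%20s", str_x)     # right-align in a 20-character field
```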



[julia-users] Re: Announcing TensorFlow.jl, an interface to Google's TensorFlow machine learning library

2016-09-01 Thread Vishnu Raj
Thanks Jon, I was planning to start learning TensorFlow; now I can do it 
in Julia :)

On Thursday, September 1, 2016 at 4:01:58 AM UTC+5:30, Jonathan Malmaud 
wrote:
>
> Hello,
> I'm pleased to announce the release of TensorFlow.jl, enabling modern 
> GPU-accelerated deep learning for Julia. Simply run Pkg.add("TensorFlow") 
> to install and then read through the documentation at 
> https://malmaud.github.io/tfdocs/index.html to get started. Please file 
> any issues you encounter at https://github.com/malmaud/TensorFlow.jl. 
>
> TensorFlow.jl offers a convenient Julian interface to Google's TensorFlow 
> library. It includes functionality for building up a computation graph that 
> encodes a deep-learning model and automatically minimizing an arbitrary 
> loss function with respect to the model parameters. Support is included for 
> convolutional networks, recurrent networks with LSTMs, the Adam 
> optimization algorithm, loading images, and checkpointing model parameters 
> to disk during training.
>
> I'm hopeful that this package will ensure Julia remains a first-class 
> citizen in the world of modern machine learning, and I look forward to the 
> community's help in getting it to match or exceed the capabilities of the 
> official Python TensorFlow API. 
>
> -Jon
>
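
For the curious, the graph-and-session model described in the announcement 
looks roughly like this (a sketch only; the exact exported names should be 
checked against the linked documentation):

```Julia
using TensorFlow

sess = Session()      # a session executes nodes of the computation graph
a = constant(3.0)     # graph nodes; nothing is computed yet
b = constant(4.0)
c = a + b             # still just a node in the graph
run(sess, c)          # evaluates the graph and returns 7.0
```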


[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Josh Langsfeld
This link is only to an archive of the source code; you would still have to 
build julia after downloading this.

Ideally what you want is an ARM binary that's version 0.4 instead of a 
nightly build but I don't see anywhere obvious where that can be downloaded.

RobotOS will start working on 0.5 and up eventually, but you may still need 
to wait a few weeks.

On Thursday, September 1, 2016 at 7:52:09 PM UTC-4, Angshuman Goswami wrote:
>
> But there is no folder /bin/julia in the one I downloaded from 
> https://github.com/JuliaLang/julia/releases/tag/v0.4.6
>
> What should be the simlink when I try to build with this ??
>
> On Thursday, September 1, 2016 at 6:52:41 PM UTC-4, Kaj Wiik wrote:
>>
>> Hi!
>>
>> You symlink a wrong file, first 
>> sudo rm /usr/local/bin/julia.h
>>
>> The correct symlink line is
>> sudo ln -s /opt/julia-0.4.6/bin/julia  /usr/local/bin
>>
>> On Friday, September 2, 2016 at 1:11:07 AM UTC+3, Angshuman Goswami wrote:
>>>
>>> I have downloaded the Julia 0.4.6 from the repository: 
>>> https://github.com/JuliaLang/julia/releases/tag/v0.4.6
>>> I extracted the folder and copied to opt folder
>>> sudo ln -s /opt/julia-0.4.6/src/julia.h  /usr/local/bin
>>>
>>> I made the folder executable using sudo chmod +x *
>>>
>>> But I am getting the error:
>>> bash: julia: command not found
>>>
>>>
>>>
>>>
>>> On Thursday, September 1, 2016 at 5:38:10 PM UTC-4, Angshuman Goswami 
>>> wrote:

 I want to use Julia 0.4.6. Can you guide me through the process as if I 
 am a novice
 On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak 
 wrote:
>
> You've already built julia I guess. You need to install python using 
> ubuntu's package system. In command prompt
> sudo apt-get install `pkg-name`
> will install the package you want to install by asking you your 
> password.
> For python
> sudo apt-get install python
> will install python. Close prompt and open julia and try again 
> building PyCall.jl by Pkg.build().
>
> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
> wrote:
>>
>> I don't get how to do that. 
>>
>> Can you please tell me the steps. Its all too confusing and I am very 
>> new to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea 
>> how 
>> to install dependancies
>>
>> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:
>>>
>>> Ah, sorry, I assumed you are using x86_64. Find the arm binary 
>>> tarball and follow the instructions otherwise. See
>>> https://github.com/JuliaLang/julia/blob/master/README.arm.md
>>>
>>>
>>> On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak 
>>> wrote:

 You are on an arm cpu so Conda cannot install python for you. Also, 
 you tried downloading x86 cpu linux binaries, instead try arm 
 nightlies.
 To get away with PyCall issues you have to manually install all 
 depencies. 

 On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman 
 Goswami wrote:
>
> When i performed build again errors cropped up.
>
> Pkg.build("PyCall")
> WARNING: unable to determine host cpu name.
> INFO: Building PyCall
> INFO: No system-wide Python was found; got the following error:
> could not spawn `/usr/local/lib/python2.7 -c "import 
> distutils.sysconfig; 
> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission 
> denied 
> (EACCES)
> using the Python distribution in the Conda package
> INFO: Downloading miniconda installer ...
>   % Total% Received % Xferd  Average Speed   TimeTime 
> Time  Current
>  Dload  Upload   Total   Spent
> Left  Speed
> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
> --:--:-- 2743k
> INFO: Installing miniconda ...
> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
> installing: _cache-0.0-py27_x0 ...
> installing: python-2.7.11-0 ...
> installing: conda-env-2.4.5-py27_0 ...
> installing: openssl-1.0.2g-0 ...
> installing: pycosat-0.6.1-py27_0 ...
> installing: pyyaml-3.11-py27_1 ...
> installing: readline-6.2-2 ...
> installing: requests-2.9.1-py27_0 ...
> installing: sqlite-3.9.2-0 ...
> installing: tk-8.5.18-0 ...
> installing: yaml-0.1.6-0 ...
> installing: zlib-1.2.8-0 ...
> installing: conda-4.0.5-py27_0 ...
> installing: pycrypto-2.6.1-py27_0 ...
> installing: pip-8.1.1-py27_1 ...
> installing: wheel-0.29.0-py27_0 ...
> installing: setuptools-20.3-py27_0 ...
> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
> /home/odroid/.julia/v0.4/Conda

[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Angshuman Goswami
I am running on an ARM board, and I think the latest release (0.4.6) has 
problems even with the 32-bit (x86) build.

Can anyone help?

On Thursday, September 1, 2016 at 7:52:09 PM UTC-4, Angshuman Goswami wrote:
>
> But there is no folder /bin/julia in the one I downloaded from 
> https://github.com/JuliaLang/julia/releases/tag/v0.4.6
>
> What should be the simlink when I try to build with this ??
>
> On Thursday, September 1, 2016 at 6:52:41 PM UTC-4, Kaj Wiik wrote:
>>
>> Hi!
>>
>> You symlink a wrong file, first 
>> sudo rm /usr/local/bin/julia.h
>>
>> The correct symlink line is
>> sudo ln -s /opt/julia-0.4.6/bin/julia  /usr/local/bin
>>
>> On Friday, September 2, 2016 at 1:11:07 AM UTC+3, Angshuman Goswami wrote:
>>>
>>> I have downloaded the Julia 0.4.6 from the repository: 
>>> https://github.com/JuliaLang/julia/releases/tag/v0.4.6
>>> I extracted the folder and copied to opt folder
>>> sudo ln -s /opt/julia-0.4.6/src/julia.h  /usr/local/bin
>>>
>>> I made the folder executable using sudo chmod +x *
>>>
>>> But I am getting the error:
>>> bash: julia: command not found
>>>
>>>
>>>
>>>
>>> On Thursday, September 1, 2016 at 5:38:10 PM UTC-4, Angshuman Goswami 
>>> wrote:

 I want to use Julia 0.4.6. Can you guide me through the process as if I 
 am a novice
 On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak 
 wrote:
>
> You've already built julia I guess. You need to install python using 
> ubuntu's package system. In command prompt
> sudo apt-get install `pkg-name`
> will install the package you want to install by asking you your 
> password.
> For python
> sudo apt-get install python
> will install python. Close prompt and open julia and try again 
> building PyCall.jl by Pkg.build().
>
> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
> wrote:
>>
>> I don't get how to do that. 
>>
>> Can you please tell me the steps. Its all too confusing and I am very 
>> new to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea 
>> how 
>> to install dependancies
>>
>> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:
>>>
>>> Ah, sorry, I assumed you are using x86_64. Find the arm binary 
>>> tarball and follow the instructions otherwise. See
>>> https://github.com/JuliaLang/julia/blob/master/README.arm.md
>>>
>>>
>>> On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak 
>>> wrote:

 You are on an arm cpu so Conda cannot install python for you. Also, 
 you tried downloading x86 cpu linux binaries, instead try arm 
 nightlies.
 To get away with PyCall issues you have to manually install all 
 depencies. 

 On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman 
 Goswami wrote:
>
> When i performed build again errors cropped up.
>
> Pkg.build("PyCall")
> WARNING: unable to determine host cpu name.
> INFO: Building PyCall
> INFO: No system-wide Python was found; got the following error:
> could not spawn `/usr/local/lib/python2.7 -c "import 
> distutils.sysconfig; 
> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission 
> denied 
> (EACCES)
> using the Python distribution in the Conda package
> INFO: Downloading miniconda installer ...
>   % Total% Received % Xferd  Average Speed   TimeTime 
> Time  Current
>  Dload  Upload   Total   Spent
> Left  Speed
> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
> --:--:-- 2743k
> INFO: Installing miniconda ...
> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
> installing: _cache-0.0-py27_x0 ...
> installing: python-2.7.11-0 ...
> installing: conda-env-2.4.5-py27_0 ...
> installing: openssl-1.0.2g-0 ...
> installing: pycosat-0.6.1-py27_0 ...
> installing: pyyaml-3.11-py27_1 ...
> installing: readline-6.2-2 ...
> installing: requests-2.9.1-py27_0 ...
> installing: sqlite-3.9.2-0 ...
> installing: tk-8.5.18-0 ...
> installing: yaml-0.1.6-0 ...
> installing: zlib-1.2.8-0 ...
> installing: conda-4.0.5-py27_0 ...
> installing: pycrypto-2.6.1-py27_0 ...
> installing: pip-8.1.1-py27_1 ...
> installing: wheel-0.29.0-py27_0 ...
> installing: setuptools-20.3-py27_0 ...
> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
> /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python:
>  
> cannot execute binary file: Exec format error
> ERROR:
> cannot execute native linux-32 binary, output from 'uname -a' is:
> Linux odroid 3.10.69 #1 SMP P

Re: [julia-users] Re: Juila vs Mathematica (Wolfram language): high-level features

2016-09-01 Thread Erik Schnetter
One of the strengths of Mathematica's pattern-matching system is the
ability to do structural matching, e.g. to match a list whose second
element is a real number. This is something that is not (yet?) possible
with Julia's dispatch. It would be nice to be able to write something like

```Julia
function f(x::Expr where x.args::Tuple{T, Real, ...})
```

(with a more concise syntax) to mimic this.
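
In the meantime, one way to approximate this kind of structural match (a 
sketch, not a substitute for dispatch-level support) is to dispatch on Expr 
and check the argument structure at runtime:

```Julia
function f(x::Expr)
    if length(x.args) >= 2 && isa(x.args[2], Real)
        # matched: an Expr whose second argument is a real number
        return x.args[2]
    end
    throw(ArgumentError("no structural match"))
end
```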

-erik


On Thu, Sep 1, 2016 at 7:10 PM, Chris Rackauckas  wrote:

> I think types+dispatch is the right way to go. It's what Julia is founded
> upon, and it's what leads to really fast computations. A fast type/dispatch
> based symbolic system for Julia and in pure Julia would be a huge asset.
> And while the post said not to mention the front end, when talking about
> Mathematica you have to talk about the front end. The only reason why I use
> it over other CAS' is because that notebook looks like math. Until someone
> implements something like that in Julia, I will always have a reason to
> open up Mathematica.
>
>
> On Thursday, September 1, 2016 at 3:22:12 PM UTC-7, lapeyre@gmail.com
> wrote:
>>
>>
>>
>> On Thursday, January 23, 2014 at 11:47:11 PM UTC+1, Акатер Дима wrote:
>>>
>>> It's mentioned here http://julialang.org/blog/2012
>>> /02/why-we-created-julia/ that Mathematica was one of the programs that
>>> inspired Julia. How does Julia compare to Mathematica's language?
>>>
>>> To make the question more specific,
>>>
>>> - it's about languages, not implementations, so Mathematica's FrontEnd
>>> capabilities are irrelevant (and it's also not about “which one is faster”)
>>> - it's not about free vs non-free
>>> - it's not about communities and support
>>> - it's not about anything related to OOP (feel free to write about,
>>> sure, but I will probably be very passive in discussions concerning OOP)
>>>
>>> - it's about the languages' most high-level features, like
>>> meta-programming, macros, interactions with other languages and the way it
>>> treats types
>>>
>>> For example, Wolfram language (which is now the name of Mma's core
>>> language), implements meta-programming capabilities via term-rewriting and
>>> Hold. It provides some foreign-language interfaces via MathLink (which
>>> could be WolframLink already) and also has SymbolicC. It is untyped and
>>> proud of it; types can be implemented easily but they are of little
>>> practical need in the absence of compiler.
>>>
>>>
>> - it's also about the languages' most distinctive features: are there
>>> things Julia has that WL does not have? (Which means adding them to WL
>>> would require reimplementing Julia in WL, much in spirit of Greenspun's
>>> tenth rule.)
>>>
>>> I think it is easier to go the other direction and implement
>> something like Mma in Julia. This is what I have done here:
>>
>> https://github.com/jlapeyre/SJulia.jl
>>
>> I think Julia is uniquely well suited for implementing a Mma-like
>> language.  I agree with the comment below that Mma is designed in part to
>> appeal to non-programmers. A large part of its appeal is that it collects a
>> lot of mathematics functionality that is hard to find elsewhere... all
>> kinds of algorithms and special functions. Many of these can be used with
>> one or a few lines of code. I kept the non-programmer in mind when writing
>> SJulia.
>>
>>The question of what kind of type system a language has is somewhat
>> polemic. In some sense, Mma is untyped. There is no hierarchy in
>> expressions; they all have a head and arguments. I think hierarchies of
>> mathematical objects are not well represented by hierarchies of programming
>> language types. Which hierarchy a particular mathematical object belongs to
>> and its exact definition is very fluid. Languages like Mma that attach no
>> inherent meaning to expressions are well suited for mathematics for
>> scientists and engineers. A matrix is an expression with head 'List' each
>> of whose elements is an expression of fixed length with head 'List'.  Still
>> types creep into Mma in various ways.
>>
>> Some people prefer types to play a larger role in symbolic computation.
>> For instance:
>>
>> http://www.sympy.org/en/index.html
>>
>> https://github.com/jverzani/SymPy.jl
>>
>> http://nemocas.org/
>>
>> Whether to use types depends in part on the domain of the language.  But
>> even for rather general math capabilities, language design determines in
>> part the role of types. Sympy (in python and Julia) aim to add symbolic
>> computation capability to Julia.  They are more 'typed' than Mma and
>> SJulia. But, it seems that python sympy is hybrid in this respect and also
>> supports generic expressions.
>>
>>
>>> To provide a starting point, here is the definition of type in Julia
>>> from documentation http://docs.julialang.org/en/l
>>> atest/manual/metaprogramming/
>>>
>>> type Expr
>>>   head::Symbol
>>>   args::Array{Any,1}
>>>   typend
>>>
>>>
>>> Maybe there's a typo in docs (line 4) but it doesn't really matter. What
>>> do J

[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Angshuman Goswami
But there is no folder /bin/julia in the one I downloaded from 
https://github.com/JuliaLang/julia/releases/tag/v0.4.6

What should the symlink be when I try to build with this?

On Thursday, September 1, 2016 at 6:52:41 PM UTC-4, Kaj Wiik wrote:
>
> Hi!
>
> You symlink a wrong file, first 
> sudo rm /usr/local/bin/julia.h
>
> The correct symlink line is
> sudo ln -s /opt/julia-0.4.6/bin/julia  /usr/local/bin
>
> On Friday, September 2, 2016 at 1:11:07 AM UTC+3, Angshuman Goswami wrote:
>>
>> I have downloaded the Julia 0.4.6 from the repository: 
>> https://github.com/JuliaLang/julia/releases/tag/v0.4.6
>> I extracted the folder and copied to opt folder
>> sudo ln -s /opt/julia-0.4.6/src/julia.h  /usr/local/bin
>>
>> I made the folder executable using sudo chmod +x *
>>
>> But I am getting the error:
>> bash: julia: command not found
>>
>>
>>
>>
>> On Thursday, September 1, 2016 at 5:38:10 PM UTC-4, Angshuman Goswami 
>> wrote:
>>>
>>> I want to use Julia 0.4.6. Can you guide me through the process as if I 
>>> am a novice
>>> On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak 
>>> wrote:

 You've already built julia I guess. You need to install python using 
 ubuntu's package system. In command prompt
 sudo apt-get install `pkg-name`
 will install the package you want to install by asking you your 
 password.
 For python
 sudo apt-get install python
 will install python. Close prompt and open julia and try again building 
 PyCall.jl by Pkg.build().

 On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
 wrote:
>
> I don't get how to do that. 
>
> Can you please tell me the steps. Its all too confusing and I am very 
> new to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea how 
> to install dependancies
>
> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:
>>
>> Ah, sorry, I assumed you are using x86_64. Find the arm binary 
>> tarball and follow the instructions otherwise. See
>> https://github.com/JuliaLang/julia/blob/master/README.arm.md
>>
>>
>> On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak 
>> wrote:
>>>
>>> You are on an arm cpu so Conda cannot install python for you. Also, 
>>> you tried downloading x86 cpu linux binaries, instead try arm nightlies.
>>> To get away with PyCall issues you have to manually install all 
>>> depencies. 
>>>
>>> On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman Goswami 
>>> wrote:

 When i performed build again errors cropped up.

 Pkg.build("PyCall")
 WARNING: unable to determine host cpu name.
 INFO: Building PyCall
 INFO: No system-wide Python was found; got the following error:
 could not spawn `/usr/local/lib/python2.7 -c "import 
 distutils.sysconfig; 
 print(distutils.sysconfig.get_config_var('VERSION'))"`: permission 
 denied 
 (EACCES)
 using the Python distribution in the Conda package
 INFO: Downloading miniconda installer ...
   % Total% Received % Xferd  Average Speed   TimeTime 
 Time  Current
  Dload  Upload   Total   Spent
 Left  Speed
 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
 --:--:-- 2743k
 INFO: Installing miniconda ...
 PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
 installing: _cache-0.0-py27_x0 ...
 installing: python-2.7.11-0 ...
 installing: conda-env-2.4.5-py27_0 ...
 installing: openssl-1.0.2g-0 ...
 installing: pycosat-0.6.1-py27_0 ...
 installing: pyyaml-3.11-py27_1 ...
 installing: readline-6.2-2 ...
 installing: requests-2.9.1-py27_0 ...
 installing: sqlite-3.9.2-0 ...
 installing: tk-8.5.18-0 ...
 installing: yaml-0.1.6-0 ...
 installing: zlib-1.2.8-0 ...
 installing: conda-4.0.5-py27_0 ...
 installing: pycrypto-2.6.1-py27_0 ...
 installing: pip-8.1.1-py27_1 ...
 installing: wheel-0.29.0-py27_0 ...
 installing: setuptools-20.3-py27_0 ...
 /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
 /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python:
  
 cannot execute binary file: Exec format error
 ERROR:
 cannot execute native linux-32 binary, output from 'uname -a' is:
 Linux odroid 3.10.69 #1 SMP PREEMPT Thu Feb 12 15:22:14 BRST 2015 
 armv7l armv7l armv7l GNU/Linux
 ===[ ERROR: PyCall 
 ]

 LoadError: failed process: 
 Process(`/home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh -b -f -p 
 /home/odroid/.julia/v0.4/Co

[julia-users] Re: Juila vs Mathematica (Wolfram language): high-level features

2016-09-01 Thread Chris Rackauckas
I think types+dispatch is the right way to go. It's what Julia is founded 
upon, and it's what leads to really fast computations. A fast type/dispatch-based 
symbolic system for Julia, in pure Julia, would be a huge asset. 
And while the post said not to mention the front end, when talking about 
Mathematica you have to talk about the front end. The only reason I use 
it over other CASes is that the notebook looks like math. Until someone 
implements something like that in Julia, I will always have a reason to 
open up Mathematica.

On Thursday, September 1, 2016 at 3:22:12 PM UTC-7, lapeyre@gmail.com 
wrote:
>
>
>
> On Thursday, January 23, 2014 at 11:47:11 PM UTC+1, Акатер Дима wrote:
>>
>> It's mentioned here 
>> http://julialang.org/blog/2012/02/why-we-created-julia/ that Mathematica 
>> was one of the programs that inspired Julia. How does Julia compare to 
>> Mathematica's language?
>>
>> To make the question more specific,
>>
>> - it's about languages, not implementations, so Mathematica's FrontEnd 
>> capabilities are irrelevant (and it's also not about “which one is faster”)
>> - it's not about free vs non-free
>> - it's not about communities and support
>> - it's not about anything related to OOP (feel free to write about, sure, 
>> but I will probably be very passive in discussions concerning OOP)
>>
>> - it's about the languages' most high-level features, like 
>> meta-programming, macros, interactions with other languages and the way it 
>> treats types
>>
>> For example, Wolfram language (which is now the name of Mma's core 
>> language), implements meta-programming capabilities via term-rewriting and 
>> Hold. It provides some foreign-language interfaces via MathLink (which 
>> could be WolframLink already) and also has SymbolicC. It is untyped and 
>> proud of it; types can be implemented easily but they are of little 
>> practical need in the absence of compiler.
>>  
>>
> - it's also about the languages' most distinctive features: are there 
>> things Julia has that WL does not have? (Which means adding them to WL 
>> would require reimplementing Julia in WL, much in spirit of Greenspun's 
>> tenth rule.)
>>
>> I think it is easier to go the other direction and implement 
> something like Mma in Julia. This is what I have done here:
>
> https://github.com/jlapeyre/SJulia.jl
>
> I think Julia is uniquely well suited for implementing a Mma-like 
> language.  I agree with the comment below that Mma is designed in part to 
> appeal to non-programmers. A large part of its appeal is that it collects a 
> lot of mathematics functionality that is hard to find elsewhere... all 
> kinds of algorithms and special functions. Many of these can be used with 
> one or a few lines of code. I kept the non-programmer in mind when writing 
> SJulia.
>
>The question of what kind of type system a language has is somewhat 
> polemic. In some sense, Mma is untyped. There is no hierarchy in 
> expressions; they all have a head and arguments. I think hierarchies of 
> mathematical objects are not well represented by hierarchies of programming 
> language types. Which hierarchy a particular mathematical object belongs to 
> and its exact definition is very fluid. Languages like Mma that attach no 
> inherent meaning to expressions are well suited for mathematics for 
> scientists and engineers. A matrix is an expression with head 'List' each 
> of whose elements is an expression of fixed length with head 'List'.  Still 
> types creep into Mma in various ways. 
>
> Some people prefer types to play a larger role in symbolic computation. 
> For instance:
>
> http://www.sympy.org/en/index.html
>
> https://github.com/jverzani/SymPy.jl
>
> http://nemocas.org/
>
> Whether to use types depends in part on the domain of the language.  But 
> even for rather general math capabilities, language design determines in 
> part the role of types. Sympy (in python and Julia) aim to add symbolic 
> computation capability to Julia.  They are more 'typed' than Mma and 
> SJulia. But, it seems that python sympy is hybrid in this respect and also 
> supports generic expressions.
>  
>
>> To provide a starting point, here is the definition of type in Julia from 
>> documentation http://docs.julialang.org/en/latest/manual/metaprogramming/
>>
>> type Expr
>>   head::Symbol
>>   args::Array{Any,1}
>>   typend
>>
>>
>> Maybe there's a typo in docs (line 4) but it doesn't really matter. What 
>> do Julia users do, for example, to avoid boilerplate code defining lots of 
>> types? I understand how to avoid writing boilerplate definitions in WL: it 
>> may not be easy but at least I know where to begin in case I need a program 
>> that would write new definitions or update existing ones.
>>
>

[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Kaj Wiik
Hi!

You symlinked the wrong file. First remove it:
sudo rm /usr/local/bin/julia.h

The correct symlink command is
sudo ln -s /opt/julia-0.4.6/bin/julia  /usr/local/bin

On Friday, September 2, 2016 at 1:11:07 AM UTC+3, Angshuman Goswami wrote:
>
> I have downloaded the Julia 0.4.6 from the repository: 
> https://github.com/JuliaLang/julia/releases/tag/v0.4.6
> I extracted the folder and copied to opt folder
> sudo ln -s /opt/julia-0.4.6/src/julia.h  /usr/local/bin
>
> I made the folder executable using sudo chmod +x *
>
> But I am getting the error:
> bash: julia: command not found
>
>
>
>
> On Thursday, September 1, 2016 at 5:38:10 PM UTC-4, Angshuman Goswami 
> wrote:
>>
>> I want to use Julia 0.4.6. Can you guide me through the process as if I 
>> am a novice
>> On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak wrote:
>>>
>>> You've already built julia I guess. You need to install python using 
>>> ubuntu's package system. In command prompt
>>> sudo apt-get install `pkg-name`
>>> will install the package you want to install by asking you your password.
>>> For python
>>> sudo apt-get install python
>>> will install python. Close prompt and open julia and try again building 
>>> PyCall.jl by Pkg.build().
>>>
>>> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
>>> wrote:

 I don't get how to do that. 

 Can you please tell me the steps. Its all too confusing and I am very 
 new to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea how 
 to install dependancies

 On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:
>
> Ah, sorry, I assumed you are using x86_64. Find the arm binary tarball 
> and follow the instructions otherwise. See
> https://github.com/JuliaLang/julia/blob/master/README.arm.md
>
>
> On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak 
> wrote:
>>
>> You are on an arm cpu so Conda cannot install python for you. Also, 
>> you tried downloading x86 cpu linux binaries, instead try arm nightlies.
>> To get away with PyCall issues you have to manually install all 
>> depencies. 
>>
>> On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman Goswami 
>> wrote:
>>>
>>> When i performed build again errors cropped up.
>>>
>>> Pkg.build("PyCall")
>>> WARNING: unable to determine host cpu name.
>>> INFO: Building PyCall
>>> INFO: No system-wide Python was found; got the following error:
>>> could not spawn `/usr/local/lib/python2.7 -c "import 
>>> distutils.sysconfig; 
>>> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission 
>>> denied 
>>> (EACCES)
>>> using the Python distribution in the Conda package
>>> INFO: Downloading miniconda installer ...
>>>   % Total% Received % Xferd  Average Speed   TimeTime 
>>> Time  Current
>>>  Dload  Upload   Total   Spent
>>> Left  Speed
>>> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
>>> --:--:-- 2743k
>>> INFO: Installing miniconda ...
>>> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
>>> installing: _cache-0.0-py27_x0 ...
>>> installing: python-2.7.11-0 ...
>>> installing: conda-env-2.4.5-py27_0 ...
>>> installing: openssl-1.0.2g-0 ...
>>> installing: pycosat-0.6.1-py27_0 ...
>>> installing: pyyaml-3.11-py27_1 ...
>>> installing: readline-6.2-2 ...
>>> installing: requests-2.9.1-py27_0 ...
>>> installing: sqlite-3.9.2-0 ...
>>> installing: tk-8.5.18-0 ...
>>> installing: yaml-0.1.6-0 ...
>>> installing: zlib-1.2.8-0 ...
>>> installing: conda-4.0.5-py27_0 ...
>>> installing: pycrypto-2.6.1-py27_0 ...
>>> installing: pip-8.1.1-py27_1 ...
>>> installing: wheel-0.29.0-py27_0 ...
>>> installing: setuptools-20.3-py27_0 ...
>>> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
>>> /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python:
>>>  
>>> cannot execute binary file: Exec format error
>>> ERROR:
>>> cannot execute native linux-32 binary, output from 'uname -a' is:
>>> Linux odroid 3.10.69 #1 SMP PREEMPT Thu Feb 12 15:22:14 BRST 2015 
>>> armv7l armv7l armv7l GNU/Linux
>>> ===[ ERROR: PyCall 
>>> ]
>>>
>>> LoadError: failed process: 
>>> Process(`/home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh -b -f -p 
>>> /home/odroid/.julia/v0.4/Conda/deps/usr`, ProcessExited(1)) [1]
>>> while loading /home/odroid/.julia/v0.4/PyCall/deps/build.jl, in 
>>> expression starting on line 17
>>>
>>>
>>> 
>>>
>>> [ BUILD ERRORS 
>>> ]
>>>
>>> WARNING: P

Re: [julia-users] IBM Power port

2016-09-01 Thread James Fairbanks
Hi Viral, 

I got negative results on my Power8 machine. 
After untarring the tarball linked above, I got the following errors just 
running the REPL.


[jpf@power8 julia-3005940a21]$ ./bin/julia 
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
'powerpc64le' is not a recognized processor for this target (ignoring 
processor)
   _
   _   _ _(_)_ |  A fresh approach to technical computing
  (_) | (_) (_)|  Documentation: http://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.5.0-rc3+3 (2016-08-26 06:19 UTC)
 _/ |\__'_|_|_|\__'_|  |  sf/ppc64le/3005940 (fork: 3 commits, 9 days)
|__/   |  powerpc64le-unknown-linux-gnu


Then I got a lot more errors. 
For example: 

WARNING: Method definition f(Tuple{Vararg{Int64, #N<:Any}}, 
AbstractArray{#T<:Any, #N<:Any}) in module Main at 
/home/jpf/julia-3005940a21/share/julia/test/core.jl:706 overwritten at 
/home/jpf/julia-3005940a21/share/julia/test/core.jl:712.
From worker 13: * linalg/diagonal   in 101.62 seconds, 
maxrss  348.31 MB
From worker 13: * inference in   0.92 seconds, 
maxrss  352.38 MB
From worker 13: * keywordargs   in   1.56 seconds, 
maxrss  354.13 MB
WARNING: Method definition f() in module JLCall14301 at 
/home/jpf/julia-3005940a21/share/julia/test/core.jl:3529 overwritten at 
/home/jpf/julia-3005940a21/share/julia/test/core.jl:3539.
From worker 5: * linalg/matmul in 169.17 seconds, 
maxrss  367.69 MB
From worker 16: * linalg/cholesky   in  96.71 seconds, 
maxrss  332.44 MB
From worker 16: * char Error During Test
From worker 16:  Test threw an exception of type InexactError
From worker 16:  Expression: $(Expr(:escape, 
:(convert(Char,Float16(x) $(Expr(:escape, :(==))) $(Expr(:escape, 
:(convert(Char,Float32(x) $(Expr(:escape, :(==))) $(Expr(:escape, 
:(convert(Char,Float64(x) $(Expr(:escape, :(==))) $(Expr(:escape, 
:(Char(x
From worker 16:  InexactError()
From worker 16:   in macro expansion; at 
/home/jpf/julia-3005940a21/share/julia/test/char.jl:72 [inlined]
From worker 16:   in anonymous at ./:?
From worker 16:   in include_string(::String, ::String) at 
./loading.jl:380
From worker 16:   in include_from_node1(::String) at 
./loading.jl:429
From worker 16:   in macro expansion at ./util.jl:226 [inlined]
From worker 16:   in runtests(::String) at 
/home/jpf/julia-3005940a21/share/julia/test/testdefs.jl:7
From worker 16:   in 
(::Base.Serializer.__deserialized_types__.##16#24)(::String) at 
/home/jpf/julia-3005940a21/share/julia/test/runtests.jl:44
From worker 16:   in 
(::Base.##625#627{Base.CallMsg{:call_fetch}})() at ./multi.jl:1421
From worker 16:   in 
run_work_thunk(::Base.##625#627{Base.CallMsg{:call_fetch}}, ::Bool) at 
./multi.jl:1001
From worker 16:   in macro expansion at ./multi.jl:1421 [inlined]
From worker 16:   in 
(::Base.##624#626{Base.CallMsg{:call_fetch},Base.MsgHeader,TCPSocket})() at 
./event.jl:68

On Tuesday, August 30, 2016 at 1:30:54 AM UTC-4, Viral Shah wrote:
>
> I should point out that the linalg tests are expected to fail for now, 
> since we are awaiting a new openblas release, which is known to fix these 
> issues.
>
> -viral
>
> On Friday, August 19, 2016 at 10:26:38 AM UTC+5:30, Viral Shah wrote:
>>
>> I have uploaded Julia-0.5 on Power8 binaries here. These are built with 
>> the latest openblas (that passes all julia tests) and hence there is no 
>> need to use ATLAS. 
>>
>> https://drive.google.com/open?id=0B0rXlkvSbIfhVWpZb2hqclBIVms 
>>
>> Would be great if people can try this out. 
>>
>> -viral 
>>
>>
>>
>> > On Aug 19, 2016, at 9:06 AM, Viral Shah > 
>> wrote: 
>> > 
>> > I am getting successful builds on the OSU Power8 machine. Once openblas 
>> has a new release, I suspect we can provide pre-packaged power8 binaries. 
>> > 
>> > I am building on CentOS 7 and this is what lscpu says: 
>> > 
>> > Architectur

[julia-users] Re: Juila vs Mathematica (Wolfram language): high-level features

2016-09-01 Thread lapeyre . math122a


On Thursday, January 23, 2014 at 11:47:11 PM UTC+1, Акатер Дима wrote:
>
> It's mentioned here 
> http://julialang.org/blog/2012/02/why-we-created-julia/ that Mathematica 
> was one of the programs that inspired Julia. How does Julia compare to 
> Mathematica's language?
>
> To make the question more specific,
>
> - it's about languages, not implementations, so Mathematica's FrontEnd 
> capabilities are irrelevant (and it's also not about “which one is faster”)
> - it's not about free vs non-free
> - it's not about communities and support
> - it's not about anything related to OOP (feel free to write about, sure, 
> but I will probably be very passive in discussions concerning OOP)
>
> - it's about the languages' most high-level features, like 
> meta-programming, macros, interactions with other languages and the way it 
> treats types
>
> For example, Wolfram language (which is now the name of Mma's core 
> language), implements meta-programming capabilities via term-rewriting and 
> Hold. It provides some foreign-language interfaces via MathLink (which 
> could be WolframLink already) and also has SymbolicC. It is untyped and 
> proud of it; types can be implemented easily but they are of little 
> practical need in the absence of compiler.
>  
>
- it's also about the languages' most distinctive features: are there 
> things Julia has that WL does not have? (Which means adding them to WL 
> would require reimplementing Julia in WL, much in spirit of Greenspun's 
> tenth rule.)
>
> I think it is easier to go the other direction and implement something 
like Mma in Julia. This is what I have done here:

https://github.com/jlapeyre/SJulia.jl

I think Julia is uniquely well suited for implementing a Mma-like 
language.  I agree with the comment below that Mma is designed in part to 
appeal to non-programmers. A large part of its appeal is that it collects a 
lot of mathematics functionality that is hard to find elsewhere... all 
kinds of algorithms and special functions. Many of these can be used with 
one or a few lines of code. I kept the non-programmer in mind when writing 
SJulia.

   The question of what kind of type system a language has is somewhat 
polemic. In some sense, Mma is untyped. There is no hierarchy in 
expressions; they all have a head and arguments. I think hierarchies of 
mathematical objects are not well represented by hierarchies of programming 
language types. Which hierarchy a particular mathematical object belongs to 
and its exact definition is very fluid. Languages like Mma that attach no 
inherent meaning to expressions are well suited for mathematics for 
scientists and engineers. A matrix is an expression with head 'List' each 
of whose elements is an expression of fixed length with head 'List'.  Still 
types creep into Mma in various ways. 

Some people prefer types to play a larger role in symbolic computation. For 
instance:

http://www.sympy.org/en/index.html

https://github.com/jverzani/SymPy.jl

http://nemocas.org/

Whether to use types depends in part on the domain of the language.  But 
even for rather general math capabilities, language design determines in 
part the role of types. SymPy (in Python, and via SymPy.jl in Julia) aims to add 
symbolic computation capability to Julia. It is more 'typed' than Mma and 
SJulia. But it seems that Python's SymPy is hybrid in this respect and also 
supports generic expressions.
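
To make the head-and-arguments picture concrete, here is a minimal sketch (a 
hypothetical representation, not SJulia's actual internals) of such an untyped 
expression in Julia, with a 2x2 matrix as nested 'List' expressions:

```Julia
type Mxpr              # hypothetical name: just a head and its arguments
    head::Symbol
    args::Vector{Any}
end

row1 = Mxpr(:List, Any[1, 2])
row2 = Mxpr(:List, Any[3, 4])
m    = Mxpr(:List, Any[row1, row2])   # a 2x2 "matrix": a List of Lists
```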
 

> To provide a starting point, here is the definition of type in Julia from 
> documentation http://docs.julialang.org/en/latest/manual/metaprogramming/
>
> type Expr
>   head::Symbol
>   args::Array{Any,1}
>   typend
>
>
> Maybe there's a typo in docs (line 4) but it doesn't really matter. What 
> do Julia users do, for example, to avoid boilerplate code defining lots of 
> types? I understand how to avoid writing boilerplate definitions in WL: it 
> may not be easy but at least I know where to begin in case I need a program 
> that would write new definitions or update existing ones.
>


[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Angshuman Goswami
I have downloaded Julia 0.4.6 from the repository: 
https://github.com/JuliaLang/julia/releases/tag/v0.4.6
I extracted the archive, copied it to the /opt folder, and ran
sudo ln -s /opt/julia-0.4.6/src/julia.h  /usr/local/bin

I made the folder executable using sudo chmod +x *

But I am getting the error:
bash: julia: command not found




On Thursday, September 1, 2016 at 5:38:10 PM UTC-4, Angshuman Goswami wrote:
>
> I want to use Julia 0.4.6. Can you guide me through the process as if I am 
> a novice
> On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak wrote:
>>
>> You've already built julia I guess. You need to install python using 
>> ubuntu's package system. In command prompt
>> sudo apt-get install `pkg-name`
>> will install the package you want to install by asking you your password.
>> For python
>> sudo apt-get install python
>> will install python. Close prompt and open julia and try again building 
>> PyCall.jl by Pkg.build().
>>
>> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
>> wrote:
>>>
>>> I don't get how to do that. 
>>>
>>> Can you please tell me the steps. Its all too confusing and I am very 
>>> new to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea how 
>>> to install dependancies
>>>
>>> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:

 Ah, sorry, I assumed you are using x86_64. Find the arm binary tarball 
 and follow the instructions otherwise. See
 https://github.com/JuliaLang/julia/blob/master/README.arm.md


 On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak 
 wrote:
>
> You are on an arm cpu so Conda cannot install python for you. Also, 
> you tried downloading x86 cpu linux binaries, instead try arm nightlies.
> To get away with PyCall issues you have to manually install all 
> depencies. 
>
> On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman Goswami 
> wrote:
>>
>> When i performed build again errors cropped up.
>>
>> Pkg.build("PyCall")
>> WARNING: unable to determine host cpu name.
>> INFO: Building PyCall
>> INFO: No system-wide Python was found; got the following error:
>> could not spawn `/usr/local/lib/python2.7 -c "import 
>> distutils.sysconfig; 
>> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission 
>> denied 
>> (EACCES)
>> using the Python distribution in the Conda package
>> INFO: Downloading miniconda installer ...
>>   % Total% Received % Xferd  Average Speed   TimeTime 
>> Time  Current
>>  Dload  Upload   Total   Spent
>> Left  Speed
>> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
>> --:--:-- 2743k
>> INFO: Installing miniconda ...
>> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
>> installing: _cache-0.0-py27_x0 ...
>> installing: python-2.7.11-0 ...
>> installing: conda-env-2.4.5-py27_0 ...
>> installing: openssl-1.0.2g-0 ...
>> installing: pycosat-0.6.1-py27_0 ...
>> installing: pyyaml-3.11-py27_1 ...
>> installing: readline-6.2-2 ...
>> installing: requests-2.9.1-py27_0 ...
>> installing: sqlite-3.9.2-0 ...
>> installing: tk-8.5.18-0 ...
>> installing: yaml-0.1.6-0 ...
>> installing: zlib-1.2.8-0 ...
>> installing: conda-4.0.5-py27_0 ...
>> installing: pycrypto-2.6.1-py27_0 ...
>> installing: pip-8.1.1-py27_1 ...
>> installing: wheel-0.29.0-py27_0 ...
>> installing: setuptools-20.3-py27_0 ...
>> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
>> /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python: 
>> cannot execute binary file: Exec format error
>> ERROR:
>> cannot execute native linux-32 binary, output from 'uname -a' is:
>> Linux odroid 3.10.69 #1 SMP PREEMPT Thu Feb 12 15:22:14 BRST 2015 
>> armv7l armv7l armv7l GNU/Linux
>> ===[ ERROR: PyCall 
>> ]
>>
>> LoadError: failed process: 
>> Process(`/home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh -b -f -p 
>> /home/odroid/.julia/v0.4/Conda/deps/usr`, ProcessExited(1)) [1]
>> while loading /home/odroid/.julia/v0.4/PyCall/deps/build.jl, in 
>> expression starting on line 17
>>
>>
>> 
>>
>> [ BUILD ERRORS 
>> ]
>>
>> WARNING: PyCall had build errors.
>>
>>  - packages with build errors remain installed in 
>> /home/odroid/.julia/v0.4
>>  - build the package(s) and all dependencies with 
>> `Pkg.build("PyCall")`
>>  - build a single package by running its `deps/build.jl` script
>>
>>
>> ==

[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Angshuman Goswami
I want to use Julia 0.4.6. Can you guide me through the process as if I am 
a novice?
On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak wrote:
>
> You've already built julia I guess. You need to install python using 
> ubuntu's package system. In command prompt
> sudo apt-get install `pkg-name`
> will install the package you want to install by asking you your password.
> For python
> sudo apt-get install python
> will install python. Close prompt and open julia and try again building 
> PyCall.jl by Pkg.build().
>
> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
> wrote:
>>
>> I don't get how to do that. 
>>
>> Can you please tell me the steps. Its all too confusing and I am very new 
>> to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea how to 
>> install dependancies
>>
>> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:
>>>
>>> Ah, sorry, I assumed you are using x86_64. Find the arm binary tarball 
>>> and follow the instructions otherwise. See
>>> https://github.com/JuliaLang/julia/blob/master/README.arm.md
>>>
>>>
>>> On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak wrote:

 You are on an arm cpu so Conda cannot install python for you. Also, you 
 tried downloading x86 cpu linux binaries, instead try arm nightlies.
 To get away with PyCall issues you have to manually install all 
 depencies. 

 On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman Goswami 
 wrote:
>
> When i performed build again errors cropped up.
>
> Pkg.build("PyCall")
> WARNING: unable to determine host cpu name.
> INFO: Building PyCall
> INFO: No system-wide Python was found; got the following error:
> could not spawn `/usr/local/lib/python2.7 -c "import 
> distutils.sysconfig; 
> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission denied 
> (EACCES)
> using the Python distribution in the Conda package
> INFO: Downloading miniconda installer ...
>   % Total% Received % Xferd  Average Speed   TimeTime 
> Time  Current
>  Dload  Upload   Total   Spent
> Left  Speed
> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
> --:--:-- 2743k
> INFO: Installing miniconda ...
> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
> installing: _cache-0.0-py27_x0 ...
> installing: python-2.7.11-0 ...
> installing: conda-env-2.4.5-py27_0 ...
> installing: openssl-1.0.2g-0 ...
> installing: pycosat-0.6.1-py27_0 ...
> installing: pyyaml-3.11-py27_1 ...
> installing: readline-6.2-2 ...
> installing: requests-2.9.1-py27_0 ...
> installing: sqlite-3.9.2-0 ...
> installing: tk-8.5.18-0 ...
> installing: yaml-0.1.6-0 ...
> installing: zlib-1.2.8-0 ...
> installing: conda-4.0.5-py27_0 ...
> installing: pycrypto-2.6.1-py27_0 ...
> installing: pip-8.1.1-py27_1 ...
> installing: wheel-0.29.0-py27_0 ...
> installing: setuptools-20.3-py27_0 ...
> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
> /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python: 
> cannot execute binary file: Exec format error
> ERROR:
> cannot execute native linux-32 binary, output from 'uname -a' is:
> Linux odroid 3.10.69 #1 SMP PREEMPT Thu Feb 12 15:22:14 BRST 2015 
> armv7l armv7l armv7l GNU/Linux
> ===[ ERROR: PyCall 
> ]
>
> LoadError: failed process: 
> Process(`/home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh -b -f -p 
> /home/odroid/.julia/v0.4/Conda/deps/usr`, ProcessExited(1)) [1]
> while loading /home/odroid/.julia/v0.4/PyCall/deps/build.jl, in 
> expression starting on line 17
>
>
> 
>
> [ BUILD ERRORS 
> ]
>
> WARNING: PyCall had build errors.
>
>  - packages with build errors remain installed in 
> /home/odroid/.julia/v0.4
>  - build the package(s) and all dependencies with `Pkg.build("PyCall")`
>  - build a single package by running its `deps/build.jl` script
>
>
> 
>
>
> On Wednesday, August 31, 2016 at 12:08:33 AM UTC-4, Angshuman Goswami 
> wrote:
>>
>> julia> Pkg.status()
>> 7 required packages:
>>  - AmplNLWriter  0.2.2
>>  - CoinOptServices   0.1.2
>>  - IJulia1.2.0
>>  - Ipopt 0.2.4
>>  - JuMP  0.14.0
>>  - PyCall1.7.1
>>  - RobotOS   0.4.1
>> 19 additional packages:
>>  -

[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Angshuman Goswami
When I remove the queue_size argument, I get this error:

ERROR: LoadError: Subscribing to a topic is currently broken on julia v0.5 
and above. See
https://github.com/jdlangs/RobotOS.jl/issues/15 for ongoing efforts to fix 
this.
while loading 
/home/odroid/barc/workspace/src/barc/src/controller_MPC_ARC01.jl, in 
expression starting on line 198


On Thursday, September 1, 2016 at 5:31:54 PM UTC-4, Angshuman Goswami wrote:
>
> Thanks everyone for the help.
>
> I managed to install Julia using Nightly builds . 
>
> Now I have new problems. When I run the program. I am getting this error.
>
> EXIT: Optimal Solution Found.
> finished initial solve!
>   ERROR: LoadError: MethodError: no method matching 
> RobotOS.Subscriber{T}(::String, ::Type{barc.msg.Z_KinBkMdl}, 
> ::#SE_callback; queue_size=10)
> Closest candidates are:
>   RobotOS.Subscriber{T}(::Any...) at 
> /home/odroid/.julia/v0.6/RobotOS/src/pubsub.jl:24 got an unsupported 
> keyword argument "queue_size"
> while loading 
> /home/odroid/barc/workspace/src/barc/src/controller_MPC_ARC01.jl, in 
> expression starting on line 198
>
>
> On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak wrote:
>>
>> You've already built julia I guess. You need to install python using 
>> ubuntu's package system. In command prompt
>> sudo apt-get install `pkg-name`
>> will install the package you want to install by asking you your password.
>> For python
>> sudo apt-get install python
>> will install python. Close prompt and open julia and try again building 
>> PyCall.jl by Pkg.build().
>>
>> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
>> wrote:
>>>
>>> I don't get how to do that. 
>>>
>>> Can you please tell me the steps. Its all too confusing and I am very 
>>> new to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea how 
>>> to install dependancies
>>>
>>> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:

 Ah, sorry, I assumed you are using x86_64. Find the arm binary tarball 
 and follow the instructions otherwise. See
 https://github.com/JuliaLang/julia/blob/master/README.arm.md


 On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak 
 wrote:
>
> You are on an arm cpu so Conda cannot install python for you. Also, 
> you tried downloading x86 cpu linux binaries, instead try arm nightlies.
> To get away with PyCall issues you have to manually install all 
> depencies. 
>
> On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman Goswami 
> wrote:
>>
>> When i performed build again errors cropped up.
>>
>> Pkg.build("PyCall")
>> WARNING: unable to determine host cpu name.
>> INFO: Building PyCall
>> INFO: No system-wide Python was found; got the following error:
>> could not spawn `/usr/local/lib/python2.7 -c "import 
>> distutils.sysconfig; 
>> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission 
>> denied 
>> (EACCES)
>> using the Python distribution in the Conda package
>> INFO: Downloading miniconda installer ...
>>   % Total% Received % Xferd  Average Speed   TimeTime 
>> Time  Current
>>  Dload  Upload   Total   Spent
>> Left  Speed
>> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
>> --:--:-- 2743k
>> INFO: Installing miniconda ...
>> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
>> installing: _cache-0.0-py27_x0 ...
>> installing: python-2.7.11-0 ...
>> installing: conda-env-2.4.5-py27_0 ...
>> installing: openssl-1.0.2g-0 ...
>> installing: pycosat-0.6.1-py27_0 ...
>> installing: pyyaml-3.11-py27_1 ...
>> installing: readline-6.2-2 ...
>> installing: requests-2.9.1-py27_0 ...
>> installing: sqlite-3.9.2-0 ...
>> installing: tk-8.5.18-0 ...
>> installing: yaml-0.1.6-0 ...
>> installing: zlib-1.2.8-0 ...
>> installing: conda-4.0.5-py27_0 ...
>> installing: pycrypto-2.6.1-py27_0 ...
>> installing: pip-8.1.1-py27_1 ...
>> installing: wheel-0.29.0-py27_0 ...
>> installing: setuptools-20.3-py27_0 ...
>> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
>> /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python: 
>> cannot execute binary file: Exec format error
>> ERROR:
>> cannot execute native linux-32 binary, output from 'uname -a' is:
>> Linux odroid 3.10.69 #1 SMP PREEMPT Thu Feb 12 15:22:14 BRST 2015 
>> armv7l armv7l armv7l GNU/Linux
>> ===[ ERROR: PyCall 
>> ]
>>
>> LoadError: failed process: 
>> Process(`/home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh -b -f -p 
>> /home/odroid/.julia/v0.4/Conda/deps/usr`, ProcessExited(1)) [1]
>> while loading /home/odroid/.julia/v0.4/PyCall/deps/build.jl, in 
>> expression starting on line 17
>>

[julia-users] Re: Running Julia in Ubuntu

2016-09-01 Thread Angshuman Goswami
Thanks everyone for the help.

I managed to install Julia using the nightly builds.

Now I have a new problem. When I run the program, I get this error:

EXIT: Optimal Solution Found.
finished initial solve!
  ERROR: LoadError: MethodError: no method matching 
RobotOS.Subscriber{T}(::String, ::Type{barc.msg.Z_KinBkMdl}, 
::#SE_callback; queue_size=10)
Closest candidates are:
  RobotOS.Subscriber{T}(::Any...) at 
/home/odroid/.julia/v0.6/RobotOS/src/pubsub.jl:24 got an unsupported 
keyword argument "queue_size"
while loading 
/home/odroid/barc/workspace/src/barc/src/controller_MPC_ARC01.jl, in 
expression starting on line 198


On Thursday, September 1, 2016 at 2:24:43 AM UTC-4, Lutfullah Tomak wrote:
>
> You've already built julia I guess. You need to install python using 
> ubuntu's package system. In command prompt
> sudo apt-get install `pkg-name`
> will install the package you want to install by asking you your password.
> For python
> sudo apt-get install python
> will install python. Close prompt and open julia and try again building 
> PyCall.jl by Pkg.build().
>
> On Wednesday, August 31, 2016 at 11:48:32 PM UTC+3, Angshuman Goswami 
> wrote:
>>
>> I don't get how to do that. 
>>
>> Can you please tell me the steps. Its all too confusing and I am very new 
>> to Ubuntu or Julia. Mostly used to work on Matlab. I have no idea how to 
>> install dependancies
>>
>> On Wednesday, August 31, 2016 at 3:26:40 AM UTC-4, Kaj Wiik wrote:
>>>
>>> Ah, sorry, I assumed you are using x86_64. Find the arm binary tarball 
>>> and follow the instructions otherwise. See
>>> https://github.com/JuliaLang/julia/blob/master/README.arm.md
>>>
>>>
>>> On Wednesday, August 31, 2016 at 9:54:38 AM UTC+3, Lutfullah Tomak wrote:

 You are on an arm cpu so Conda cannot install python for you. Also, you 
 tried downloading x86 cpu linux binaries, instead try arm nightlies.
 To get away with PyCall issues you have to manually install all 
 depencies. 

 On Wednesday, August 31, 2016 at 7:53:24 AM UTC+3, Angshuman Goswami 
 wrote:
>
> When i performed build again errors cropped up.
>
> Pkg.build("PyCall")
> WARNING: unable to determine host cpu name.
> INFO: Building PyCall
> INFO: No system-wide Python was found; got the following error:
> could not spawn `/usr/local/lib/python2.7 -c "import 
> distutils.sysconfig; 
> print(distutils.sysconfig.get_config_var('VERSION'))"`: permission denied 
> (EACCES)
> using the Python distribution in the Conda package
> INFO: Downloading miniconda installer ...
>   % Total% Received % Xferd  Average Speed   TimeTime 
> Time  Current
>  Dload  Upload   Total   Spent
> Left  Speed
> 100 24.7M  100 24.7M0 0  2401k  0  0:00:10  0:00:10 
> --:--:-- 2743k
> INFO: Installing miniconda ...
> PREFIX=/home/odroid/.julia/v0.4/Conda/deps/usr
> installing: _cache-0.0-py27_x0 ...
> installing: python-2.7.11-0 ...
> installing: conda-env-2.4.5-py27_0 ...
> installing: openssl-1.0.2g-0 ...
> installing: pycosat-0.6.1-py27_0 ...
> installing: pyyaml-3.11-py27_1 ...
> installing: readline-6.2-2 ...
> installing: requests-2.9.1-py27_0 ...
> installing: sqlite-3.9.2-0 ...
> installing: tk-8.5.18-0 ...
> installing: yaml-0.1.6-0 ...
> installing: zlib-1.2.8-0 ...
> installing: conda-4.0.5-py27_0 ...
> installing: pycrypto-2.6.1-py27_0 ...
> installing: pip-8.1.1-py27_1 ...
> installing: wheel-0.29.0-py27_0 ...
> installing: setuptools-20.3-py27_0 ...
> /home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh: line 288: 
> /home/odroid/.julia/v0.4/Conda/deps/usr/pkgs/python-2.7.11-0/bin/python: 
> cannot execute binary file: Exec format error
> ERROR:
> cannot execute native linux-32 binary, output from 'uname -a' is:
> Linux odroid 3.10.69 #1 SMP PREEMPT Thu Feb 12 15:22:14 BRST 2015 
> armv7l armv7l armv7l GNU/Linux
> ===[ ERROR: PyCall 
> ]
>
> LoadError: failed process: 
> Process(`/home/odroid/.julia/v0.4/Conda/deps/usr/installer.sh -b -f -p 
> /home/odroid/.julia/v0.4/Conda/deps/usr`, ProcessExited(1)) [1]
> while loading /home/odroid/.julia/v0.4/PyCall/deps/build.jl, in 
> expression starting on line 17
>
>
> 
>
> [ BUILD ERRORS 
> ]
>
> WARNING: PyCall had build errors.
>
>  - packages with build errors remain installed in 
> /home/odroid/.julia/v0.4
>  - build the package(s) and all dependencies with `Pkg.build("PyCall")`
>  - build a single package by running its `deps/build.jl` script
>
>
> ==

[julia-users] Re: When git master is preferred to the tagged version obtained by Pkg. add()

2016-09-01 Thread Tim Wheeler
Pkg.add fetches the latest published version of the package (which resides 
in Metadata). That is not necessarily the master version.
This is so that package developers working on new features don't 
inadvertently break a lot of folks' code. That being said, it is up to the 
package developer to tag and publish new versions to ensure that bug fixes 
reach users.
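
For reference, the usual dance with the stock Pkg calls ("SomePackage" is just a 
placeholder name):

Pkg.add("SomePackage")        # latest tagged version registered in METADATA
Pkg.checkout("SomePackage")   # switch to the package's git master branch
Pkg.build("SomePackage")      # re-run its build step, if it has one
Pkg.free("SomePackage")       # go back to the registered, tagged releases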

On Thursday, September 1, 2016 at 11:55:01 AM UTC-7, Colin Beckingham wrote:
>
> I just ran into the situation with a Julia package where testing with the 
> version of the package obtained by Pkg.add was broken with respect to the 
> test process. Pkg.checkout followed by Pkg.build gave me the master version 
> which worked fine in test. Does this necessarily imply that Pkg.add is 
> fetching the wrong version?
>


[julia-users] When git master is preferred to the tagged version obtained by Pkg. add()

2016-09-01 Thread Colin Beckingham
I just ran into the situation with a Julia package where testing with the 
version of the package obtained by Pkg.add was broken with respect to the 
test process. Pkg.checkout followed by Pkg.build gave me the master version 
which worked fine in test. Does this necessarily imply that Pkg.add is 
fetching the wrong version?


[julia-users] Re: @threads all vs. @parallel ???

2016-09-01 Thread Chris Rackauckas
If you @threads and each time you're in the loop you're acting on a 
different part of the array, it'll be thread-safe. 

I think the better examples are what's not thread-safe. If you're doing 
something like looping and tmp+=A[i], then several threads can read the same 
tmp at the same time and write back into the same spot, ultimately not summing up all the 
components of A and getting the wrong answer. Another problem comes up with globals. 
You might have a loop where you're using BigFloats, and in the loop it 
might change the precision if a certain condition is met. However, since 
BigFloat precision is a global, it will change the precision for all of the 
threads. This becomes a problem because this behavior is non-deterministic: 
which threads will have been run before this computation depends on random 
factors about why certain threads ran slightly faster/slower. 
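
A small illustration of both patterns (Julia has to be started with the 
JULIA_NUM_THREADS environment variable set for @threads to use more than one 
thread; the function names here are only for the example):

using Base.Threads

# thread-safe: every iteration writes to its own slot of out, so threads never collide
function squares!(out, A)
    @threads for i in 1:length(A)
        out[i] = A[i]^2
    end
    return out
end

# the naive tmp += A[i] reduction is NOT thread-safe; one common workaround is
# a separate partial sum per thread, combined after the loop
function threaded_sum(A)
    partials = zeros(eltype(A), nthreads())
    @threads for i in 1:length(A)
        partials[threadid()] += A[i]
    end
    return sum(partials)
end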

On Thursday, September 1, 2016 at 11:17:25 AM UTC-7, digxx wrote:
>
> Do you have a "simple" example of how to write something "thread safe" if 
> I plan to use @threads ?
>
> Am Mittwoch, 31. August 2016 07:35:06 UTC+2 schrieb Chris Rackauckas:
>>
>> That's pretty much it. Threads are shared memory, which have less 
>> overhead (and are thus faster), and can share variables between them. 
>> @parallel is multiprocessing, i.e. each worker process has its own set of 
>> defined variables which do not overlap, and data has to be transferred 
>> between them. @parallel has the advantage that it does not have to be 
>> local: different processes can be on completely different computers/nodes. 
>> But it has a higher startup cost and is thus better suited to larger 
>> parallelizable problems.
>>
>> However, as Yichao states, @threads is still experimental. For one, since 
>> the memory is shared, you have to make sure everything is 
>> "thread-safe" in order to be correct and not fail (example: two threads 
>> can't write to the same spot at once or else it can be non-deterministic as 
>> to what the result is). But also, the internals still have some rough 
>> edges. Just check out the repo for bug reports and you'll see that things 
>> can still go wrong, or that your performance can even decrease due to 
>> type-inference bugs.  Thus 
>> it is something to play around with, but it definitely isn't something that 
>> you should put into production yet (though in many cases it is already 
>> looking pretty good!).
>>
>> On Tuesday, August 30, 2016 at 5:46:57 PM UTC-7, Andrew wrote:
>>>
>>> I have also been wondering this. I tried @threads yesterday and it got 
>>> me around a 4-fold speedup on a loop which applied a function to each 
>>> element in an array, and I conveniently didn't need to bother using 
>>> SharedArrays as I would with @parallel.
>>>
>>> On Tuesday, August 30, 2016 at 7:20:36 PM UTC-4, digxx wrote:

 Sorry if there is already some information on this though I didnt find 
 it...
 So: What is the difference between these?
 I have used @parallel so far for parallel loops but recently saw this 
 @threads all in some video and I was wondering what the difference is?
 Could anyone elaborate or give me a link with some info?
 Thanks digxx

>>>

Re: [julia-users] Re: @threads all vs. @parallel ???

2016-09-01 Thread Yichao Yu
On Thu, Sep 1, 2016 at 2:17 PM, digxx  wrote:

> Do you have a "simple" example of how to write something "thread safe" if
> I plan to use @threads ?
>

Some parts of the runtime are still not thread-safe.
I also don't think there's a meaningful "simple example" of writing
something "thread safe". It strongly depends on what you want to do.


>
>
> Am Mittwoch, 31. August 2016 07:35:06 UTC+2 schrieb Chris Rackauckas:
>>
>> That's pretty much it. Threads are shared memory, which have less
>> overhead (and are thus faster), and can share variables between them.
>> @parallel is multiprocessing, i.e. each worker process has its own set of
>> defined variables which do not overlap, and data has to be transferred
>> between them. @parallel has the advantage that it does not have to be
>> local: different processes can be on completely different computers/nodes.
>> But it has a higher startup cost and is thus better suited to larger
>> parallelizable problems.
>>
>> However, as Yichao states, @threads is still experimental. For one, since
>> the memory is shared, you have to make sure everything is
>> "thread-safe" in order to be correct and not fail (example: two threads
>> can't write to the same spot at once or else it can be non-deterministic as
>> to what the result is). But also, the internals still have some rough
>> edges. Just check out the repo for bug reports and you'll see that things
>> can still go wrong, or that your performance can even decrease due to
>> type-inference bugs.  Thus
>> it is something to play around with, but it definitely isn't something that
>> you should put into production yet (though in many cases it is already
>> looking pretty good!).
>>
>> On Tuesday, August 30, 2016 at 5:46:57 PM UTC-7, Andrew wrote:
>>>
>>> I have also been wondering this. I tried @threads yesterday and it got
>>> me around a 4-fold speedup on a loop which applied a function to each
>>> element in an array, and I conveniently didn't need to bother using
>>> SharedArrays as I would with @parallel.
>>>
>>> On Tuesday, August 30, 2016 at 7:20:36 PM UTC-4, digxx wrote:

 Sorry if there is already some information on this though I didnt find
 it...
 So: What is the difference between these?
 I have used @parallel so far for parallel loops but recently saw this
 @threads all in some video and I was wondering what the difference is?
 Could anyone elaborate or give me a link with some info?
 Thanks digxx

>>>


[julia-users] Re: @threads all vs. @parallel ???

2016-09-01 Thread digxx
Do you have a "simple" example of how to write something "thread safe" if I 
plan to use @threads ?

Am Mittwoch, 31. August 2016 07:35:06 UTC+2 schrieb Chris Rackauckas:
>
> That's pretty much it. Threads are shared memory, which have less overhead 
> (and are thus faster), and can share variables between them. @parallel is 
> multiprocessing, i.e. each worker process has its own set of defined 
> variables which do not overlap, and data has to be transferred between 
> them. @parallel has the advantage that it does not have to be local: 
> different processes can be on completely different computers/nodes. But it 
> has a higher startup cost and is thus better suited to larger 
> parallelizable problems.
>
> However, as Yichao states, @threads is still experimental. For one, since 
> the memory is shared, you have to make sure everything is 
> "thread-safe" in order to be correct and not fail (example: two threads 
> can't write to the same spot at once or else it can be non-deterministic as 
> to what the result is). But also, the internals still have some rough 
> edges. Just check out the repo for bug reports and you'll see that things 
> can still go wrong, or that your performance can even decrease due to 
> type-inference bugs.  Thus 
> it is something to play around with, but it definitely isn't something that 
> you should put into production yet (though in many cases it is already 
> looking pretty good!).
>
> On Tuesday, August 30, 2016 at 5:46:57 PM UTC-7, Andrew wrote:
>>
>> I have also been wondering this. I tried @threads yesterday and it got me 
>> around a 4-fold speedup on a loop which applied a function to each element 
>> in an array, and I conveniently didn't need to bother using SharedArrays as 
>> I would with @parallel.
>>
>> On Tuesday, August 30, 2016 at 7:20:36 PM UTC-4, digxx wrote:
>>>
>>> Sorry if there is already some information on this though I didnt find 
>>> it...
>>> So: What is the difference between these?
>>> I have used @parallel so far for parallel loops but recently saw this 
>>> @threads all in some video and I was wondering what the difference is?
>>> Could anyone elaborate or give me a link with some info?
>>> Thanks digxx
>>>
>>

[julia-users] Re: Questions on parallelizing code - or how to deal with objects (and not just gathering data) in parallel.

2016-09-01 Thread Chris Rackauckas
Hey,
  There are some things that changed in v0.5, so I would suggest that 
you start this part of the project on v0.5. 

  That said, I think you have to build the tools yourself using the basic 
parallel macros. You might want to look into ParallelDataTransfer.jl. It's built 
off a solution from StackExchange a while ago, though there is a relevant 
bug you'll need to help us squash. 
Anything you find helpful in this area I would love to have as a 
contribution to this package. It would be helpful to the community to have 
a curated repository of these functions/macros.
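
In the meantime the pattern can be written with the stock primitives. A rough 
sketch, keeping one worker-local state per process and shipping only parameters 
and data summaries across processes (MCState, DataObject, doMCupdate!, summarize 
and update_from stand for the functions sketched in the post; note that 
remotecall_fetch takes the function first on 0.5, while on 0.4 the worker id 
comes first):

addprocs(4)

@everywhere begin
    # each worker keeps its own large state in a worker-local global
    function init_state!(params)
        global state = MCState(params)
        return nothing
    end
    function run_block!(n)
        global state
        data = DataObject()
        for it in 1:n
            doMCupdate!(state, data)
        end
        return summarize(data)            # only a small summary travels back
    end
    function set_params!(p)
        global state
        state.parameters = p
        return nothing
    end
end

for w in workers()
    remotecall_fetch(init_state!, w, initial_params)
end
for iter in 1:maxiters                    # or: until converged by some measure
    summaries = [remotecall_fetch(run_block!, w, n) for w in workers()]
    newparams = update_from(summaries)    # master combines the improved statistics
    for w in workers()
        remotecall_fetch(set_params!, w, newparams)
    end
end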

On Thursday, September 1, 2016 at 8:24:21 AM UTC-7, Sleort wrote:
>
> Hi,
>
> I am trying to figure out how to parallelize a slightly convoluted Monte 
> Carlo simulation in Julia (0.4.6), but have a hard time figuring out the 
> "best"/"recommended" way of doing it. The non-parallel program structure 
> goes like this:
>
>1. Initialize a (large) Monte Carlo state object (of its own type), 
>which is going to be updated using a Markov Chain Monte Carlo update 
>algorithm. Say,
>x =MCState()
>In my case this is NOT an array, but a linked list/graph structure. 
>The state object also contains some parameters, to be iteratively 
>determined.
>2. Do n Monte Carlo updates (which changes the state x) and gather 
>some data from this in a dataobject.
>for it=1:n
>doMCupdate!(x,dataobject)
>end
>3. Based on the gathered data, the parameters of the MC state should 
>be updated,
>updateparameters!(x,dataObject)
>4. Repeat from 2 until convergence by some measure.
>
> *Ideally*, the parallel code should read something like this:
>
>1. Initialize a Monte Carlo state object on each worker. The state is 
>large (in memory), so it should not be copied/moved around between workers.
>2. Do independent Monte Carlo updates on each worker, collecting the 
>data in independent dataobjects.
>3. Gather all the relevant data of the dataobjects on the master 
>process. Calculate what the new parameters should be based on these 
>(compared to the non-parallel case, statistically improved) data. 
>Distribute these parameters back to the Monte Carlo state objects on each 
>worker process.
>4. Repeat from 2 until convergence by some measure.
>
> The question is: What is the "best" way of accomplishing this in Julia? 
>
> As long as the entire program is wrapped within the same function/global 
> scope, the parallel case can be accomplished by the use of @everywhere, 
> @parallel for, and @eval @everywhere x.parameters = $newparameters (for 
> broadcasting the new parameters from the master to the workers). This 
> however, results in a long, ugly code, which probably isn't very efficient 
> from a compiler point of view. I would rather like to pass the parallel 
> MCstate objects between the various steps in the algorithm, like in the 
> non-parallel way. This could (should?) maybe be achieved with the use of 
> RemoteRefs? However, RemoteRefs are references to results of a calculation 
> rather than the objects on which the calculations are performed. The 
> objects could of course be accessed by clever use of identity functions, 
> the put() function etc., but again the approach seems rather 
> inelegant/"hackish" to me...
>
> To summarize/generalize: I'm wondering about how to deal with independent 
> objects defined on each worker process. How to pass them between functions 
> in parallel. How to gather information from them to the master process. How 
> to broadcast information from the master to the workers... To me, my 
> problem seems to be somewhat beyond the @parallel for, pmap and similar 
> "distribute calculations and gather the result and that's it" approaches 
> explained in the documentation and elsewhere. However, I'm sure there is a 
> natural way to deal with it in Julia. After all, I'm trying to achieve a 
> rather generic parallel programming pattern.
>
> Any suggestions/ideas are very welcome!
>


Re: [julia-users] "Namespaces" for accessing fields of composite types

2016-09-01 Thread Michael Borregaard
To do something like this in Julia, there are several possibilities, 
depending what you want to achieve. You could

abstract Cat # we decide that Cat will always have an age variable

type Tiger <: Cat
  age::Int
end

meow(c::Cat) = println(c.age)

# then you can
tigre = Tiger(5)
meow(tigre)
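
or keep the composition from the original post and forward the field access 
explicitly (a minimal sketch; the age accessor is just one conventional way to 
do the forwarding):

type Cat
  age::Int
end

type Tiger
  innerCat::Cat
end

age(c::Cat) = c.age
age(t::Tiger) = age(t.innerCat)   # explicit forwarding instead of inheritance

meow(x) = println(age(x))

meow(Tiger(Cat(5)))               # prints 5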




[julia-users] Re: How to plot different histograms in one histogram ?

2016-09-01 Thread Steven G. Johnson


On Thursday, September 1, 2016 at 9:07:07 AM UTC-4, Ahmed Mazari wrote:
>
>
> the link talks about plotting in Python. I tried to implement the code 
> but the function dic is not recognized. The error returned is "dic not 
> defined"
>
>
> *common_params = dict(bins=20,  range=(-5, 5), 
>  normed="True")*
>

You have to translate to Julia syntax, of course.

(This is the big problem with PyPlot — most of the documentation is for 
Matplotlib, hence it is in Python, and therefore you have to know *both* 
Julia *and* Python in order to translate it effectively.)

In Julia, for example, you write Dict(:bins=>20, :range=>(-5,5), 
:normed=>true) for the equivalent of the Python dict in the example on 
stackoverflow.
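
For the overlaid histograms from the original question, a minimal PyPlot sketch 
along those lines (x and y stand for the data vectors; the colors, labels and 
alpha value are just illustrative choices):

using PyPlot

common = Dict(:bins => 20, :range => (-5, 5), :normed => true, :alpha => 0.5)

plt[:hist](x; label="x", color="red", common...)
plt[:hist](y; label="y", color="green", common...)
legend()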


Re: [julia-users] Re: FYI: Second transpiler to Julia(?), from Ruby, and benchmarks

2016-09-01 Thread Yichao Yu
On Wed, Aug 31, 2016 at 4:08 AM,  wrote:

> The creator of virtual_module and ruby2julia transpiler here, just dropped
> in to see what's going on now. Thank you for your interest.
>
> > Is it including startup/compilation time? Did they not "run it twice"?
>
> Yes, it includes startup/compilation time.(I'm not sure if I understand
> "runt it twice" meaning properly though)
>
> > B. About Classes and
> > https://en.wikipedia.org/wiki/Composition_over_inheritance
> > that is I guess best, but maybe not to helpful for that project.. Should
> that be enough, to compile to that, or any other ideas?
>
> This idea will work as well. Still thinking what's the best, but it's
> possible anyway.
>
> > C. I'm sure Julia has as good decimal support as possible already, with
> two different packages. I'm not sure what's in Ruby (so can't comment on
> that code), I guess the maker of the project is not aware, only of what is
> in Base.
>
> Thanks to your comment, I have found the solution. Just use base(x, y)
> then any conversion could be done. Thank you.
>
> > That is https://github.com/Ken-B/RoR_julia_eg
> > that uses ZMQ.jl (better for IPC)?
>
> ZMQ sounds promising in order to add more concurrency to virtual_module.
>
> > And in practice it will probably be slower than the source language
> because Julia is not as heavily optimized for interpreting those semantics.
>
> True. And my experiment is to gain performance improvements in exchange
> for giving up completeness of accuracy of Ruby syntax. The project goal is
> something like "gain BIG performance improvement with more than 90% Ruby
> Syntax coverage", though not sure yet if I can make this happen. Anyways
> thank you for your comment.
>

This might be doable, although be aware that 90% of the syntax could mean <10%
of non-toy code. That might be good enough as a starting point if the goal
is to port code to Julia, but it won't be if the intent is to run the code in the
original language.


[julia-users] Re: FYI: Second transpiler to Julia(?), from Ruby, and benchmarks

2016-09-01 Thread Páll Haraldsson
On Thursday, September 1, 2016 at 3:24:21 PM UTC, k...@swd.cc wrote:
>
> The creator of virtual_module and ruby2julia transpiler here, just dropped 
> in to see what's going on now. Thank you for your interest.
>
> > Is it including startup/compilation time? Did they not "run it twice"?
>
> Yes, it includes startup/compilation time.(I'm not sure if I understand 
> "runt it twice" meaning properly though)
>

The first time you run Julia code, it's slower as then you include compile 
time. See: http://docs.julialang.org/en/latest/manual/performance-tips/

> That is https://github.com/Ken-B/RoR_julia_eg
> that uses ZMQ.jl (better for IPC)?

ZMQ sounds promising in order to add more concurrency to virtual_module.

>> And in practice it will probably be slower than the source language 
because Julia is not as heavily optimized for interpreting those semantics.

>True. And my experiment is to gain performance improvements in exchange 
for giving up completeness of accuracy of Ruby syntax. The project goal is 
something like "gain BIG performance improvement with more than 90% Ruby 
Syntax coverage", though not sure yet if I can make this happen. Anyways 
thank you for your comment.

Julia interop (not transpiling), e.g. RoR_julia_eg, helps to get speed. Transpiling 
would help to migrate code and/or, if you are willing to modify the 
transpiled code, to get more speed than Ruby. As explained by Steven (who 
makes PyCall.jl, with tighter integration than already done for Ruby or any 
other dynamic language), you shouldn't expect a speedup, but that depends on how 
slow the implementation of Ruby is. It seems you are not gaining from 
Julia, the *language* per se, only from the BLAS functions, which are actually 
written in Fortran (though they could have been fast in Julia), so you're 
gaining from a library that could (in theory) be used directly from Ruby.

-- 
Palli.



[julia-users] Re: Saving and Loading data (when JLD is not suitable)

2016-09-01 Thread Páll Haraldsson
On Wednesday, August 31, 2016 at 4:47:37 AM UTC, Lyndon White wrote:
>
>
> There are 3 problems with using Base,serialize as a data storage format.
>
>1. *It is not stable* -- the format is evolving as the language 
>evolves, and it breaks the ability to load files
>2. *It is only usable from Julia* -- vs JLD which is, in the end fancy 
>HDF5, anything can read it after a little work
>3. *It is not safe from a security perspective*  -- Maliciously 
>crafted ".jsz" files can allow arbitrary code execution to occur during 
> the 
>deserialize step.
>
>
You just reminded me of:

http://prevayler.org/

that has been ported to languages other than Java, its first language, and I think 
would also be nice for Julia. They solved a different problem than the one 
you have (or the three above), and pointed to XQuery/XML, if I recall, for 
the query / [export out of the "db"] serialization.


http://www.onjava.com/pub/a/onjava/2005/06/08/prevayler.html

"A *prevalent* system makes use of serialization, and is again useful only 
when an in-memory data set is feasible. A serialized snapshot of a working 
system can be taken at regular intervals as a first-line storage mechanism. 
[..]
Prevayler 1.0 was awarded a JOLT Productivity  
award in 2004. The recent version, 2.0, has many improvements, including a 
simpler API."

https://github.com/jsampson/prevayler

-- 
Palli.



[julia-users] Re: FYI: Second transpiler to Julia(?), from Ruby, and benchmarks

2016-09-01 Thread k
The creator of virtual_module and ruby2julia transpiler here, just dropped 
in to see what's going on now. Thank you for your interest.

> Is it including startup/compilation time? Did they not "run it twice"?

Yes, it includes startup/compilation time.(I'm not sure if I understand 
"runt it twice" meaning properly though)

> B. About Classes and
> https://en.wikipedia.org/wiki/Composition_over_inheritance
> that is I guess best, but maybe not to helpful for that project.. Should 
that be enough, to compile to that, or any other ideas?

This idea will work as well. Still thinking what's the best, but it's 
possible anyway.

> C. I'm sure Julia has as good decimal support as possible already, with 
two different packages. I'm not sure what's in Ruby (so can't comment on 
that code), I guess the maker of the project is not aware, only of what is 
in Base.

Thanks to your comment, I have found the solution. Just use base(x, y) then 
any conversion could be done. Thank you.

> That is https://github.com/Ken-B/RoR_julia_eg
> that uses ZMQ.jl (better for IPC)?

ZMQ sounds promising in order to add more concurrency to virtual_module.

> And in practice it will probably be slower than the source language 
because Julia is not as heavily optimized for interpreting those semantics.

True. And my experiment is to gain performance improvements in exchange for 
giving up completeness of accuracy of Ruby syntax. The project goal is 
something like "gain BIG performance improvement with more than 90% Ruby 
Syntax coverage", though not sure yet if I can make this happen. Anyways 
thank you for your comment.


[julia-users] Questions on parallelizing code - or how to deal with objects (and not just gathering data) in parallel.

2016-09-01 Thread Sleort
Hi,

I am trying to figure out how to parallelize a slightly convoluted Monte 
Carlo simulation in Julia (0.4.6), but have a hard time figuring out the 
"best"/"recommended" way of doing it. The non-parallel program structure 
goes like this:

   1. Initialize a (large) Monte Carlo state object (of its own type), 
   which is going to be updated using a Markov Chain Monte Carlo update 
   algorithm. Say,
   x =MCState()
   In my case this is NOT an array, but a linked list/graph structure. The 
   state object also contains some parameters, to be iteratively determined.
   2. Do n Monte Carlo updates (which changes the state x) and gather some 
   data from this in a dataobject.
   for it=1:n
   doMCupdate!(x,dataobject)
   end
   3. Based on the gathered data, the parameters of the MC state should be 
   updated,
   updateparameters!(x,dataObject)
   4. Repeat from 2 until convergence by some measure.

*Ideally*, the parallel code should read something like this:

   1. Initialize a Monte Carlo state object on each worker. The state is 
   large (in memory), so it should not be copied/moved around between workers.
   2. Do independent Monte Carlo updates on each worker, collecting the 
   data in independent dataobjects.
   3. Gather all the relevant data of the dataobjects on the master 
   process. Calculate what the new parameters should be based on these 
   (compared to the non-parallel case, statistically improved) data. 
   Distribute these parameters back to the Monte Carlo state objects on each 
   worker process.
   4. Repeat from 2 until convergence by some measure.

The question is: What is the "best" way of accomplishing this in Julia? 

As long as the entire program is wrapped within the same function/global 
scope, the parallel case can be accomplished by the use of @everywhere, 
@parallel for, and @eval @everywhere x.parameters = $newparameters (for 
broadcasting the new parameters from the master to the workers). This 
however, results in a long, ugly code, which probably isn't very efficient 
from a compiler point of view. I would rather like to pass the parallel 
MCstate objects between the various steps in the algorithm, like in the 
non-parallel way. This could (should?) maybe be achieved with the use of 
RemoteRefs? However, RemoteRefs are references to results of a calculation 
rather than the objects on which the calculations are performed. The 
objects could of course be accessed by clever use of identity functions, 
the put() function etc., but again the approach seems rather 
inelegant/"hackish" to me...

To summarize/generalize: I'm wondering about how to deal with independent 
objects defined on each worker process. How to pass them between functions 
in parallel. How to gather information from them to the master process. How 
to broadcast information from the master to the workers... To me, my 
problem seems to be somewhat beyond the @parallel for, pmap and similar 
"distribute calculations and gather the result and that's it" approaches 
explained in the documentation and elsewhere. However, I'm sure there is a 
natural way to deal with it in Julia. After all, I'm trying to a achieve a 
rather generic parallel programming pattern.

Any suggestions/ideas are very welcome!


Re: [julia-users] Setting up Stan in Ubuntu

2016-09-01 Thread Rob J. Goedman
Chris,

Do you mind continuing this discussion in issue # 1 of Stan.jl as it is exactly 
the issue Doug raised there.

I’ve not helped by recently upgrading Stan.jl to work on Julia 0.5 and now a 
new install on Julia 0.4 gets the ‘old’ Stan 0.3.2 and Pkg.checkout(“Stan”) 
won’t work anymore on Julia 0.5 nor contain fixes as provided by Doug and Tomas 
for non-OSX usage.

The warnings are ok for now (mostly related to switching packages to Julia 0.5).

Can you edit src/Stan.jl in .julia/v0.4/Stan?

If so, you could insert a line like:

global const CMDSTAN_HOME = ENV["CMDSTAN_HOME"]
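
For the time being the value has to come from the environment rather than from a 
plain Julia variable, so something like the following in ~/.juliarc.jl (which 
runs before using Stan) should also work; the path matches your folder listing 
and the browser choice is just an example:

ENV["CMDSTAN_HOME"] = "/home/dfish/cmdstan-2.11.0"   # the top-level cmdstan directory
ENV["JULIA_SVG_BROWSER"] = "google-chrome"           # any SVG-capable browser works here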

Regards,
Rob

> On Sep 1, 2016, at 07:15, Christopher Fisher  wrote:
> 
> Once again, thanks for your help. After mastering the installation on Mac, 
> Linux seems to be a bit more tricky. Here is what I get after calling Stan
> 
> 
> WARNING: New definition 
> +(AbstractArray, DataArrays.DataArray) at 
> /home/dfish/.julia/v0.4/DataArrays/src/operators.jl:276
> is ambiguous with: 
> +(WoodburyMatrices.SymWoodbury, AbstractArray{T<:Any, 2}) at 
> /home/dfish/.julia/v0.4/WoodburyMatrices/src/SymWoodburyMatrices.jl:107.
> To fix, define 
> +(WoodburyMatrices.SymWoodbury, DataArrays.DataArray{T<:Any, 2})
> before the new definition.
> WARNING: New definition 
> +(AbstractArray, DataArrays.AbstractDataArray) at 
> /home/dfish/.julia/v0.4/DataArrays/src/operators.jl:300
> is ambiguous with: 
> +(WoodburyMatrices.SymWoodbury, AbstractArray{T<:Any, 2}) at 
> /home/dfish/.julia/v0.4/WoodburyMatrices/src/SymWoodburyMatrices.jl:107.
> To fix, define 
> +(WoodburyMatrices.SymWoodbury, DataArrays.AbstractDataArray{T<:Any, 2})
> before the new definition.
> WARNING: Base.String is deprecated, use AbstractString instead.
>   likely near /home/dfish/.julia/v0.4/Graphs/src/common.jl:3
> WARNING: Base.String is deprecated, use AbstractString instead.
>   likely near /home/dfish/.julia/v0.4/Graphs/src/dot.jl:80
> WARNING: New definition 
> promote_rule(Type{Mamba.ScalarLogical}, Type{##267#T<:Real}) at 
> /home/dfish/.julia/v0.4/Mamba/src/variate.jl:20
> is ambiguous with: 
> promote_rule(Type{#A<:Real}, Type{ForwardDiff.Dual{#N<:Any, #B<:Real}}) 
> at /home/dfish/.julia/v0.4/ForwardDiff/src/dual.jl:154.
> To fix, define 
> promote_rule(Type{Mamba.ScalarLogical}, Type{ForwardDiff.Dual{#N<:Any, 
> #B<:Real}})
> before the new definition.
> WARNING: New definition 
> promote_rule(Type{Mamba.ScalarStochastic}, Type{##270#T<:Real}) at 
> /home/dfish/.julia/v0.4/Mamba/src/variate.jl:20
> is ambiguous with: 
> promote_rule(Type{#A<:Real}, Type{ForwardDiff.Dual{#N<:Any, #B<:Real}}) 
> at /home/dfish/.julia/v0.4/ForwardDiff/src/dual.jl:154.
> To fix, define 
> promote_rule(Type{Mamba.ScalarStochastic}, Type{ForwardDiff.Dual{#N<:Any, 
> #B<:Real}})
> before the new definition.
> 
> 
> Environment variable CMDSTAN_HOME not found.
> Environment variable JULIA_SVG_BROWSER not found.
> 
> I'm not sure what the first warning is exactly. I get that for Gadfly but it 
> seems not to affect its functionality. 
> 
> When I submit CMDSTAN_HOME I get:
> 
> " "
> 
> Based on your folder structure, I pointed my variable to 
> 
> CMDSTAN_HOME="/home/dfish/cmdstan-2.11.0"
> 
> in the juliarc.jl file. 
> 
> A picture of my folder structure is included as an attachment. 
> 
> Thanks,
> 
> Chris 
> 
> 
> 
> On Thursday, September 1, 2016 at 9:36:50 AM UTC-4, Rob J Goedman wrote:
> Hi Chris,
> 
> Can you see what Ubuntu comes up with after:
> 
> 
> julia> using Stan
> INFO: Recompiling stale cache file 
> /Users/rob/.julia/lib/v0.5/Distributions.ji for module Distributions.
> INFO: Recompiling stale cache file /Users/rob/.julia/lib/v0.5/DataArrays.ji 
> for module DataArrays.
> INFO: Recompiling stale cache file /Users/rob/.julia/lib/v0.5/Gadfly.ji for 
> module Gadfly.
> INFO: Recompiling stale cache file /Users/rob/.julia/lib/v0.5/DataFrames.ji 
> for module DataFrames.
> INFO: Recompiling stale cache file 
> /Users/rob/.julia/lib/v0.5/KernelDensity.ji for module KernelDensity.
> 
> julia> CMDSTAN_HOME
> "/Users/rob/Projects/Stan/cmdstan"
> 
> This should point to the top level:
> 
> 
> 
> Regards,
> Rob
> 
>> On Sep 1, 2016, at 03:24, Christopher Fisher miamioh.edu 
>> > wrote:
>> 
>> Correction. The paths should have  " " around them. 
>> 
>> On Thursday, September 1, 2016 at 6:19:00 AM UTC-4, Christopher Fisher wrote:
>> Also, I've tried adding a path in the .juliarc file. Its not clear to me 
>> what subfolder in cmd stan it should point to, so I tried the following to 
>> no avail:
>> 
>> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>> 
>> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/src/cmdstan
>> 
>> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>> 
>> 
>> On Thursday, September 1, 2016 at 5:55:50 AM UTC-4, Christopher Fisher wrote:
>> I am encountering a problem while trying to interface Stan through Julia. 
>> Right now I am using Julia .4.6, Stan .3.2, Mamba .9.2 and Ubuntu 16.04. I 
>> 

Re: [julia-users] Setting up Stan in Ubuntu

2016-09-01 Thread Christopher Fisher
Once again, thanks for your help. After mastering the installation on Mac, 
Linux seems to be a bit more tricky. Here is what I get after calling Stan


*WARNING: New definition *
*+(AbstractArray, DataArrays.DataArray) at 
/home/dfish/.julia/v0.4/DataArrays/src/operators.jl:276*
*is ambiguous with: *
*+(WoodburyMatrices.SymWoodbury, AbstractArray{T<:Any, 2}) at 
/home/dfish/.julia/v0.4/WoodburyMatrices/src/SymWoodburyMatrices.jl:107.*
*To fix, define *
*+(WoodburyMatrices.SymWoodbury, DataArrays.DataArray{T<:Any, 2})*
*before the new definition.*
*WARNING: New definition *
*+(AbstractArray, DataArrays.AbstractDataArray) at 
/home/dfish/.julia/v0.4/DataArrays/src/operators.jl:300*
*is ambiguous with: *
*+(WoodburyMatrices.SymWoodbury, AbstractArray{T<:Any, 2}) at 
/home/dfish/.julia/v0.4/WoodburyMatrices/src/SymWoodburyMatrices.jl:107.*
*To fix, define *
*+(WoodburyMatrices.SymWoodbury, DataArrays.AbstractDataArray{T<:Any, 
2})*
*before the new definition.*
*WARNING: Base.String is deprecated, use AbstractString instead.*
*  likely near /home/dfish/.julia/v0.4/Graphs/src/common.jl:3*
*WARNING: Base.String is deprecated, use AbstractString instead.*
*  likely near /home/dfish/.julia/v0.4/Graphs/src/dot.jl:80*
*WARNING: New definition *
*promote_rule(Type{Mamba.ScalarLogical}, Type{##267#T<:Real}) at 
/home/dfish/.julia/v0.4/Mamba/src/variate.jl:20*
*is ambiguous with: *
*promote_rule(Type{#A<:Real}, Type{ForwardDiff.Dual{#N<:Any, 
#B<:Real}}) at /home/dfish/.julia/v0.4/ForwardDiff/src/dual.jl:154.*
*To fix, define *
*promote_rule(Type{Mamba.ScalarLogical}, Type{ForwardDiff.Dual{#N<:Any, 
#B<:Real}})*
*before the new definition.*
*WARNING: New definition *
*promote_rule(Type{Mamba.ScalarStochastic}, Type{##270#T<:Real}) at 
/home/dfish/.julia/v0.4/Mamba/src/variate.jl:20*
*is ambiguous with: *
*promote_rule(Type{#A<:Real}, Type{ForwardDiff.Dual{#N<:Any, 
#B<:Real}}) at /home/dfish/.julia/v0.4/ForwardDiff/src/dual.jl:154.*
*To fix, define *
*promote_rule(Type{Mamba.ScalarStochastic}, 
Type{ForwardDiff.Dual{#N<:Any, #B<:Real}})*
*before the new definition.*


*Environment variable CMDSTAN_HOME not found.*
*Environment variable JULIA_SVG_BROWSER not found.*

I'm not sure what the first warning is exactly. I get that for Gadfly but 
it seems not to affect its functionality. 

When I submit CMDSTAN_HOME I get:

" "

Based on your folder structure, I pointed my variable to 

CMDSTAN_HOME="/home/dfish/cmdstan-2.11.0"

in the juliarc.jl file. 

A picture of my folder structure is included as an attachment. 

Thanks,

Chris 



On Thursday, September 1, 2016 at 9:36:50 AM UTC-4, Rob J Goedman wrote:
>
> Hi Chris,
>
> Can you see what Ubuntu comes up with after:
>
>
> *julia> **using Stan*
> *INFO: Recompiling stale cache file 
> /Users/rob/.julia/lib/v0.5/Distributions.ji for module Distributions.*
> *INFO: Recompiling stale cache file 
> /Users/rob/.julia/lib/v0.5/DataArrays.ji for module DataArrays.*
> *INFO: Recompiling stale cache file /Users/rob/.julia/lib/v0.5/Gadfly.ji 
> for module Gadfly.*
> *INFO: Recompiling stale cache file 
> /Users/rob/.julia/lib/v0.5/DataFrames.ji for module DataFrames.*
> *INFO: Recompiling stale cache file 
> /Users/rob/.julia/lib/v0.5/KernelDensity.ji for module KernelDensity.*
>
> *julia> **CMDSTAN_HOME*
> *"/Users/rob/Projects/Stan/cmdstan"*
>
> This should point to the top level:
>
>
> Regards,
> Rob
>
> On Sep 1, 2016, at 03:24, Christopher Fisher  > wrote:
>
> Correction. The paths should have  " " around them. 
>
> On Thursday, September 1, 2016 at 6:19:00 AM UTC-4, Christopher Fisher 
> wrote:
>>
>> Also, I've tried adding a path in the .juliarc file. Its not clear to me 
>> what subfolder in cmd stan it should point to, so I tried the following to 
>> no avail:
>>
>> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>>
>> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/src/cmdstan
>>
>> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>>
>>
>> On Thursday, September 1, 2016 at 5:55:50 AM UTC-4, Christopher Fisher 
>> wrote:
>>>
>>> I am encountering a problem while trying to interface Stan through 
>>> Julia. Right now I am using Julia .4.6, Stan .3.2, Mamba .9.2 and Ubuntu 
>>> 16.04. I have successfully installed cmd Stan and ran the Bernoulli test 
>>> model. However, when I run Pkg.test("Stan") I get the following error:
>>>
>>> Environment variable CMDSTAN_HOME not found.
>>> Environment variable JULIA_SVG_BROWSER not found.
>>>
>>> Based on the Mac instructions, I tried placing the following in the 
>>> .bashrc file
>>>
>>> export STAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0/stan
>>> export CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>>> export JAGS_HOME=/usr/local/bin/jags
>>>
>>> Perhaps not surprisingly, that did not work. Any help would be 
>>> appreciated.
>>>
>>> -Chris 
>>>
>>
>

Stan Folders
Description: Binary data


[julia-users] Re: How to plot different histograms in one histogram ?

2016-09-01 Thread Ahmed Mazari

the link talks about plotting in Python. I tried to implement the code but 
the function dic is not recognized. The error returned is "dic not 
defined"




*common_params = dict(bins=20,  range=(-5, 5), 
 normed="True")*
On Thursday, September 1, 2016 at 2:50:09 PM UTC+2, Steven G. Johnson wrote:
>
>  
> http://stackoverflow.com/questions/9497524/displaying-3-histograms-on-1-axis-in-a-legible-way-matplotlib
>


[julia-users] Re: How to store graphics in julia ? png and pdf

2016-09-01 Thread Ahmed Mazari
so the structure to save histograms in different png files will be as 
follows:

using PyPlot

for i in 1:1000
    h = plt[:hist](x[i], 40) # Histogram
    savefig("h$i.png")
end

is it like this? Call h at each iteration: 
h1 ... h1000 
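
A sketch along those lines that also collects every figure into one PDF, as 
asked in the first post, using matplotlib's PdfPages through PyCall (x is 
assumed to be the collection of 1000 data vectors):

using PyCall, PyPlot
@pyimport matplotlib.backends.backend_pdf as backend_pdf

pdf = backend_pdf.PdfPages("x.pdf")   # every histogram ends up in this one file
for i in 1:1000
    clf()                             # reuse a single figure, clearing it each pass
    plt[:hist](x[i], 40)
    savefig("x$i.png")                # one png per histogram
    pdf[:savefig](gcf())              # append the current figure to the pdf
end
pdf[:close]()
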
On Thursday, September 1, 2016 at 2:48:27 PM UTC+2, Steven G. Johnson wrote:
>
>
>
> On Thursday, September 1, 2016 at 8:22:50 AM UTC-4, Ahmed Mazari wrote:
>>
>> Hello,
>>
>> I want to know how to save my histograms to my directory directly from 
>> the code.
>>
>>
>> using PyPlot
>>
>> for i in 1:1000
>> h = plt[:hist](x[i],40) # Histogram
>> # how to store each x[i] in a png format in my desktop
>>
>>
> savefig("myfile$i.png")
>
> (exactly as in Matplotlib). 
>


[julia-users] Re: How to plot different histograms in one histogram ?

2016-09-01 Thread Steven G. Johnson
 
http://stackoverflow.com/questions/9497524/displaying-3-histograms-on-1-axis-in-a-legible-way-matplotlib


[julia-users] Re: How to store graphics in julia ? png and pdf

2016-09-01 Thread Steven G. Johnson


On Thursday, September 1, 2016 at 8:22:50 AM UTC-4, Ahmed Mazari wrote:
>
> Hello,
>
> I want to know how to save my histograms to my directory directly from 
> the code.
>
>
> using PyPlot
>
> for i in 1:1000
> h = plt[:hist](x[i],40) # Histogram
> # how to store each x[i] in a png format in my desktop
>
>
savefig("myfile$i.png")

(exactly as in Matplotlib). 


[julia-users] How to plot different histograms in one histogram ?

2016-09-01 Thread Ahmed Mazari
Hello, 
I have different functions to plot but I want them in one histogram so as 
to be able to compare them.


h1 = plt[:hist](x,40) # Histogram
h2 = plt[:hist](y,40) # Histogram
h2 = plt[:hist](z,40) # Histogram
h4 = plt[:hist](k,40) # Histogram
h5 = plt[:hist](m,40) # Histogram


bins of h1 : red
bins of h2 : green 

and so on 

Is it possible to do that ?

thank you


[julia-users] How to store graphics in julia ? png and pdf

2016-09-01 Thread Ahmed Mazari
Hello,

I want to know how to save my histograms to my directory directly from 
the code.


using PyPlot

for i in 1:1000
h = plt[:hist](x[i],40) # Histogram
# how to store each x[i] in a png format in my desktop

end


 I want to have, on my desktop, the histograms saved in separate png files and then 
all the histograms in one pdf file.

x1.png
x2.png
x3.png
x4.png
...
x1000.png


and all the histograms from 1 to 1000 in one pdf file  x.pdf

thank you 


[julia-users] Re: Announcing TensorFlow.jl, an interface to Google's TensorFlow machine learning library

2016-09-01 Thread Páll Haraldsson
On Wednesday, August 31, 2016 at 10:31:58 PM UTC, Jonathan Malmaud wrote:
>
> Hello,
> I'm pleased to announce the release of TensorFlow.jl, enabling modern 
> GPU-accelerated deep learning for Julia. Simply run Pkg.add("TensorFlow") 
> to install and then read through the documentation at 
> https://malmaud.github.io/tfdocs/index.html to get started. Please file 
> any issues you encounter at https://github.com/malmaud/TensorFlow.jl. 
>

About: "To enable support for GPU usage (Linux only)"

[I'm not complaining.. more asking about the general issue] It seems 
official Tensorflow supports (for *GPU*) OS X (but not Windows).

https://www.tensorflow.org/versions/r0.10/get_started/os_setup.html

Do you not support the same because you simply, say, do not have OS X, or is 
this about some OS support missing from other packages?

>will ensure Julia remain a first-class citizen in world of modern machine 
learning

That would be great! You mean it's already ("remain") with other ANN 
packages (or yours, not your first version)? I guess people/you want 
Tensorflow (at least known in the Python world), I know nothing about it or 
the (Julia-native or otherwise..) competition.

-- 
Palli.



Re: [julia-users] Re: Announcing TensorFlow.jl, an interface to Google's TensorFlow machine learning library

2016-09-01 Thread Jonathan Malmaud
Thanks Kyunghun! It mostly uses the TensorFlow C library. It does rely on
PyCall for now for the autodifferentiation functionality, which is not yet
part of the C API. Google has said that the C API will soon expose AD
functionality, at which point this package won't depend on Python at all.

On Thu, Sep 1, 2016 at 7:17 AM Páll Haraldsson 
wrote:

> On Thursday, September 1, 2016 at 7:45:05 AM UTC, Kyunghun Kim wrote:
>>
>> Wonderful jobs, Jonathan!
>> I'd better try this version rather than use TensorFlow in python.
>>
>> Does it based on PyCall package?
>>
>
> Yes, you can always see that by looking in the REQUIRE file (except if
> used indirectly.. then you would have to look at packages recursively..).
>
>
> I guess you can also see at runtime, by what library (meaning libpython,
> not general Julia packages, maybe strace would then help..) is used..
>
> like [I know there's a command for it and this seems to be it]:
>
> pldd  |grep libpython
>
>
> [see also lsof command.]
>
>
> --
> Palli.
>
>


[julia-users] Re: Announcing TensorFlow.jl, an interface to Google's TensorFlow machine learning library

2016-09-01 Thread Páll Haraldsson
On Thursday, September 1, 2016 at 7:45:05 AM UTC, Kyunghun Kim wrote:
>
> Wonderful jobs, Jonathan! 
> I'd better try this version rather than use TensorFlow in python. 
>
> Does it based on PyCall package?
>

Yes, you can always see that by looking in the REQUIRE file (except if used 
indirectly.. then you would have to look at packages recursively..).


I guess you can also see at runtime, by what library (meaning libpython, 
not general Julia packages, maybe strace would then help..) is used..

like [I know there's a command for it and this seems to be it]:

pldd  |grep libpython


[see also lsof command.]

-- 
Palli.



Re: [julia-users] "Namespaces" for accessing fields of composite types

2016-09-01 Thread Yichao Yu
On Thu, Sep 1, 2016 at 5:51 AM, lars klein 
wrote:

> I'm currently revisiting Julia.
> The language seems almost too good to be true.
>
> One thing that irks me is the lack of OOP.
> Some time ago, I read up on all the information regarding this design
> choice.
> I don't want to debate it. Apparently that was a very informed and
> conscious decision, to keep the language lean and fast.
>
> However, while coding in Julia, I thought about this.
>
> type Cat
>   age::Int
> end
>
> type Tiger
>   innerCat::Cat
> end
>
>
>
> In a "classical" OOP language, you would derive tiger from cat and say
> that it's an "is-a" relationship.
> In Julia, you have to use composition instead.
>
> Assuming you have an instance of type tiger, accessing the age of the
> tiger becomes this.
>
> tiger.innerCat.age
>
> What about making the innerCat transparent ?
> Like this:
>
> tiger.age
>
> Is that feasible ?
> Since Julia is a compiled language, it seems to me like this would be a
> simple job for the compiler.
> I realize that there might be name-clashes in existing code.
> But that could be prevented by implementing the rule: The field age is
> searched top-down in the composite structure.
>
> I don't know the implications of this design change.
> And I don't want to presume.
>
> Did you think about this ?
> Did you decide not to include it, due to a specific reason ?
> Is the feature more complex than I realize ?
>
> If this is possible, the next thing to consider would be adapting multiple
> dispatch, too.
>
> function meow(c::Cat)
>   # age wasn't such a good field choice after all
>   println(age)
> end
>
>
> Maybe you could introduce the rule that
> meow(tiger)
>
> is implicitly converted to
> meow(tiger.cat)
>
>
> Once again.
> I know how it can be annoying to read the contributions of know-it-alls
> that actually have no clue what is going on.
> If you say "this is impossible", or even just "we can't do it in a nice
> way", I have complete understanding.
>
> But right now I'm rather enthusiastic about these changes.
> The second change should be possible via the compiler, too ?
> Right now you have generic functions. I think that this already implies
> lots of searching for adequate types. Searching in hierarchies of types
> should be possible ?
>
> The two changes would effectively make Julia a first-class OOP language.
> You could add a cat object to a tiger and the tiger would behave just like
> a cat.
> To my mind, this would be a great improvement in syntax.
> While keeping the semantics easy to understand.
>
>
The first change will be complicated. Changing the meaning of `.` is not as
simple as just chaining the name space. Function call is a known
complicated thing and it'll be very bad to make each field as expensive as
a function call.

The second change is basically impossible.

It's agreed that the namespace should be improved and there should be some
changes on this before 1.0. Simply allowing overloading `.` or changing the
meaning of it in a way that makes it much more expensive is unlikely the
way to go (although we might have something with similar function).


[julia-users] Re: Setting up Stan in Ubuntu

2016-09-01 Thread Christopher Fisher
Correction. The paths should have  " " around them. 

On Thursday, September 1, 2016 at 6:19:00 AM UTC-4, Christopher Fisher 
wrote:
>
> Also, I've tried adding a path in the .juliarc file. Its not clear to me 
> what subfolder in cmd stan it should point to, so I tried the following to 
> no avail:
>
> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>
> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/src/cmdstan
>
> CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>
>
> On Thursday, September 1, 2016 at 5:55:50 AM UTC-4, Christopher Fisher 
> wrote:
>>
>> I am encountering a problem while trying to interface Stan through Julia. 
>> Right now I am using Julia .4.6, Stan .3.2, Mamba .9.2 and Ubuntu 16.04. I 
>> have successfully installed cmd Stan and ran the Bernoulli test model. 
>> However, when I run Pkg.test("Stan") I get the following error:
>>
>> Environment variable CMDSTAN_HOME not found.
>> Environment variable JULIA_SVG_BROWSER not found.
>>
>> Based on the Mac instructions, I tried placing the following in the 
>> .bashrc file
>>
>> export STAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0/stan
>> export CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
>> export JAGS_HOME=/usr/local/bin/jags
>>
>> Perhaps not surprisingly, that did not work. Any help would be 
>> appreciated.
>>
>> -Chris 
>>
>

[julia-users] Re: Setting up Stan in Ubuntu

2016-09-01 Thread Christopher Fisher
Also, I've tried adding a path in the .juliarc file. Its not clear to me 
what subfolder in cmd stan it should point to, so I tried the following to 
no avail:

CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0

CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/src/cmdstan

CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0


On Thursday, September 1, 2016 at 5:55:50 AM UTC-4, Christopher Fisher 
wrote:
>
> I am encountering a problem while trying to interface Stan through Julia. 
> Right now I am using Julia .4.6, Stan .3.2, Mamba .9.2 and Ubuntu 16.04. I 
> have successfully installed cmd Stan and ran the Bernoulli test model. 
> However, when I run Pkg.test("Stan") I get the following error:
>
> Environment variable CMDSTAN_HOME not found.
> Environment variable JULIA_SVG_BROWSER not found.
>
> Based on the Mac instructions, I tried placing the following in the 
> .bashrc file
>
> export STAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0/stan
> export CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
> export JAGS_HOME=/usr/local/bin/jags
>
> Perhaps not surprisingly, that did not work. Any help would be appreciated.
>
> -Chris 
>


[julia-users] Setting up Stan in Ubuntu

2016-09-01 Thread Christopher Fisher
I am encountering a problem while trying to interface Stan through Julia. 
Right now I am using Julia .4.6, Stan .3.2, Mamba .9.2 and Ubuntu 16.04. I 
have successfully installed cmd Stan and ran the Bernoulli test model. 
However, when I run Pkg.test("Stan") I get the following error:

Environment variable CMDSTAN_HOME not found.
Environment variable JULIA_SVG_BROWSER not found.

Based on the Mac instructions, I tried placing the following in the .bashrc 
file

export STAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0/stan
export CMDSTAN_HOME=/home/dfish/cmdstan-2.11.0/stan_2.11.0
export JAGS_HOME=/usr/local/bin/jags

Perhaps not surprisingly, that did not work. Any help would be appreciated.

-Chris 


[julia-users] "Namespaces" for accessing fields of composite types

2016-09-01 Thread lars klein
I'm currently revisiting Julia.
The language seems almost too good to be true.

One thing that irks me is the lack of OOP.
Some time ago, I read up on all the information regarding this design 
choice.
I don't want to debate it. Apparently that was a very informed and 
conscious decision, to keep the language lean and fast.

However, while coding in Julia, I thought about this.

type Cat
  age::Int
end

type Tiger
  innerCat::Cat
end



In a "classical" OOP language, you would derive tiger from cat and say that 
it's an "is-a" relationship.
In Julia, you have to use composition instead.

Assuming you have an instance of type tiger, accessing the age of the tiger 
becomes this.

tiger.innerCat.age

What about making the innerCat transparent ?
Like this:

tiger.age

Is that feasible ?
Since Julia is a compiled language, it seems to me like this would be a 
simple job for the compiler.
I realize that there might be name-clashes in existing code.
But that could be prevented by implementing the rule: The field age is 
searched top-down in the composite structure.

I don't know the implications of this design change.
And I don't want to presume.

Did you think about this ?
Did you decide not to include it, due to a specific reason ?
Is the feature more complex than I realize ?

If this is possible, the next thing to consider would be adapting multiple 
dispatch, too.

function meow(c::Cat)
  # age wasn't such a good field choice after all
  println(age)
end


Maybe you could introduce the rule that
meow(tiger)

is implicitly converted to 
meow(tiger.cat)


Once again.
I know how it can be annoying to read the contributions of know-it-alls 
that actually have no clue what is going on.
If you say "this is impossible", or even just "we can't do it in a nice 
way", I have complete understanding.

But right now I'm rather enthusiastic about these changes.
The second change should be possible via the compiler, too ?
Right now you have generic functions. I think that this already implies 
lots of searching for adequate types. Searching in hierarchies of types 
should be possible ?

The two changes would effectively make Julia a first-class OOP language.
You could add a cat object to a tiger and the tiger would behave just like 
a cat.
To my mind, this would be a great improvement in syntax.
While keeping the semantics easy to understand.



[julia-users] Re: Announcing TensorFlow.jl, an interface to Google's TensorFlow machine learning library

2016-09-01 Thread Kyunghun Kim
Wonderful jobs, Jonathan! 
I'd better try this version rather than use TensorFlow in python. 

Does it based on PyCall package? 

-Kyunghun

On Thursday, September 1, 2016 at 7:31:58 AM UTC+9, Jonathan Malmaud wrote:
>
> Hello,
> I'm pleased to announce the release of TensorFlow.jl, enabling modern 
> GPU-accelerated deep learning for Julia. Simply run Pkg.add("TensorFlow") 
> to install and then read through the documentation at 
> https://malmaud.github.io/tfdocs/index.html to get started. Please file 
> any issues you encounter at https://github.com/malmaud/TensorFlow.jl. 
>
> TensorFlow.jl offers a convenient Julian interface to Google's TensorFlow 
> library. It includes functionality for building up a computation graph that 
> encodes a deep-learning model and automatically minimizing an arbitrary 
> loss function with respect to the model parameters. Support is included for 
> convolutional networks, recurrent networks with LSTMs, the Adam 
> optimization algorithm, loading images, and checkpointing model parameters 
> to disk during training
>
> I'm hopeful that this package will ensure Julia remain a first-class 
> citizen in world of modern machine learning and look forward to the 
> community's help in getting it to match or exceed the capabilities of the 
> official Python TensorFlow API. 
>
> -Jon
>