[julia-users] Re: ANN: Book --- Julia Programming for Operations Research

2016-10-05 Thread mmh
I haven't read the book, but I just wanted to say: lovely cover! Care to 
share some more details behind its origin and design?

On Thursday, June 2, 2016 at 9:01:30 AM UTC-4, Chang Kwon wrote:
>
> I wrote a book on julia programming, focusing on optimization problems 
> from operations research and management science.
>
> http://www.chkwon.net/julia/
>
> 
>
>
> I think this book will be useful for first-year graduate students and 
> advanced undergraduate students in operations research and related 
> disciplines who need to solve optimization problems, as well as 
> practitioners in a similar situation. 
>
> Here is a table of contents:
>
>
>1. Introduction and Installation
>2. Simple Linear Optimization
>3. Basics of the Julia Language
>4. Selected Topics in Numerical Methods
>5. The Simplex Method
>6. Network Optimization Problems
>7. General Optimization Problems
>8. Monte Carlo Methods
>9. Lagrangian Relaxation
>10. Parameters in Optimization Solvers
>11. Useful and Related Packages
>
> I believe this book will make a good reference for various courses like 
>
>
>- Introduction to Operations Research
>- Operations Research I, or Deterministic Operations Research
>- Linear Programming
>- Network Optimization
>- Nonlinear Programming
>- Convex Optimization
>- Numerical Optimization
>- Transportation Modeling
>- and any other courses involving optimization problems
>
> The book is available in the formats of online HTML, paperback, and 
> Kindle. 
>
> If you have any questions, please feel free to send me a message.
>
> Best,
> Chang
>
>
>

Re: [julia-users] Simple test functions generates different codegen

2016-10-04 Thread mmh
Ah right, I forgot @code_lowered even existed, thanks for that. Yeah, 
gcc/clang all produce the same native code from this snippet, which is why I 
was surprised that the same Julia code produced different native code.
 

On Tuesday, October 4, 2016 at 9:13:17 AM UTC-4, Isaiah wrote:
>
> These expressions are lowered differently because `test2` gets a temporary 
> due to the conditional reassignment of `u`, whereas `test1` is just a 
> straight line switch and jump (look at `code_lowered` and `code_typed`).
>
> For the same C code, the lowered IR from Clang looks similar, but it 
> appears to constant fold and reduce down to identical assembly at `-O1` and 
> above. The fact that Julia doesn't is probably due to difference in LLVM 
> optimization passes or order.
>
> As far as style, personally I think the first one is cleaner.
>
> On Fri, Sep 30, 2016 at 1:48 PM, mmh <mum...@gmail.com > 
> wrote:
>
>> I would have thought that these two functions would produce the same 
>> code, but they do not.
>>
>> Could someone explain the difference, and which is preferred and why?
>>
>>
>> http://pastebin.com/GJ8YPfV3
>>
>> function test1(x)
>>     y = 2.0
>>     u = 2.320
>>     x < 0 && (u = 32.0)
>>     x > 1 && (u = 1.0)
>>     return u + y
>> end
>>
>>
>> function test2(x)
>>     y = 2.0
>>     u = 2.320
>>     u = x < 0 ? 32.0 : u
>>     u = x > 1 ? 1.0 : u
>>     return u + y
>> end
>>
>>
>> @code_llvm test1(2.2)
>>
>> @code_llvm test2(2.2)
>>
>>
>

Re: [julia-users] Re: Julia-i18n logo proposal

2016-10-04 Thread mmh
Cool logo! IMO the "J" at the top looks a little out of place and not 
balanced with the other two glyphs.



[julia-users] Simple test functions generates different codegen

2016-09-30 Thread mmh
I would have thought that these two functions would produce the same 
code, but they do not.

Could someone explain the difference, and which is preferred and why?


http://pastebin.com/GJ8YPfV3

function test1(x)
    y = 2.0
    u = 2.320
    x < 0 && (u = 32.0)
    x > 1 && (u = 1.0)
    return u + y
end


function test2(x)
    y = 2.0
    u = 2.320
    u = x < 0 ? 32.0 : u
    u = x > 1 ? 1.0 : u
    return u + y
end


@code_llvm test1(2.2)

@code_llvm test2(2.2)
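For anyone reproducing this, here is a self-contained sketch that compares the two variants with the reflection macros (the exact IR printed will vary across Julia versions):

```julia
# The two variants from the post; both compute the same value, but test2's
# conditional reassignment introduces a temporary in the lowered IR.
function test1(x)
    y = 2.0
    u = 2.320
    x < 0 && (u = 32.0)   # short-circuit: straight-line branch and jump
    x > 1 && (u = 1.0)
    return u + y
end

function test2(x)
    y = 2.0
    u = 2.320
    u = x < 0 ? 32.0 : u  # ternary: reassigns u through a temporary
    u = x > 1 ? 1.0 : u
    return u + y
end

@assert test1(2.2) == test2(2.2) == 3.0

# Inspect each stage of compilation to see where the two diverge:
@code_lowered test1(2.2)
@code_lowered test2(2.2)
```

Semantically the functions are identical, so any codegen difference comes from lowering and optimization-pass ordering, not from the computed result.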



Re: [julia-users] Re: ANN: A potential new Discourse-based Julia forum

2016-09-22 Thread mmh
http://julia.malmaud.com

Now links to some random dude's website :P

On Monday, September 19, 2016 at 3:39:34 PM UTC-4, Jonathan Malmaud wrote:
>
> Discourse lives! 
> On Mon, Sep 19, 2016 at 3:01 PM Stefan Karpinski <ste...@karpinski.org 
> > wrote:
>
>> I got the go ahead from Jeff and Viral to give this a try, then it didn't 
>> end up panning out. It would still be worth a try, imo.
>>
>> On Sat, Sep 17, 2016 at 11:55 AM, mmh <mum...@gmail.com > 
>> wrote:
>>
>>> Hi Jonathan,
>>>
>>> Seems like this has kind of burnt out. Is there still an impetus for a 
>>> transition?
>>>
>>> On Saturday, September 19, 2015 at 8:16:36 PM UTC-4, Jonathan Malmaud 
>>> wrote:
>>>
>>>> Hi all,
>>>> There's been some chatter about maybe switching to a new, more modern 
>>>> forum platform for Julia that could potentially subsume julia-users, 
>>>> julia-dev, julia-stats, julia-gpu, and julia-jobs.   I created 
>>>> http://julia.malmaud.com for us to try one out and see if we like it. 
>>>> Please check it out and leave feedback. All the old posts from julia-users 
>>>> have already been imported to it.
>>>>
>>>> It is using Discourse <http://www.discourse.org/faq/>, the same forum 
>>>> software used for the forums of Rust <https://users.rust-lang.org>, 
>>>> BoingBoing, and some other big sites. Benefits over Google Groups include 
>>>> better support for topic tagging, community moderation features,  Markdown 
>>>> (and hence syntax highlighting) in messages, inline previews of linked-to 
>>>> Github issues, better mobile support, and more options for controlling 
>>>> when 
>>>> and what you get emailed. The Discourse website 
>>>> <http://www.discourse.org/faq/> does a better job of summarizing the 
>>>> advantages than I could.
>>>>
>>>> To get things started, Mike Innes suggested having a topic on what we 
>>>> plan on working on this coming week 
>>>> <http://julia.malmaud.com/t/whats-everyone-working-on-this-week-9-19-2015-9-26/3155>. 
>>>> I think that's a great idea.
>>>>
>>>> Just to be clear, this isn't "official" in any sense - it's just to 
>>>> kickstart the discussion. 
>>>>
>>>> -Jon
>>>>
>>>>
>>>>
>>

[julia-users] Re: ANN: A potential new Discourse-based Julia forum

2016-09-17 Thread mmh
Hi Jonathan,

Seems like this has kind of burnt out. Is there still an impetus for a 
transition?

On Saturday, September 19, 2015 at 8:16:36 PM UTC-4, Jonathan Malmaud wrote:
>
> Hi all,
> There's been some chatter about maybe switching to a new, more modern 
> forum platform for Julia that could potentially subsume julia-users, 
> julia-dev, julia-stats, julia-gpu, and julia-jobs.   I created 
> http://julia.malmaud.com for us to try one out and see if we like it. 
> Please check it out and leave feedback. All the old posts from julia-users 
> have already been imported to it.
>
> It is using Discourse, the same forum 
> software used for the forums of Rust, 
> BoingBoing, and some other big sites. Benefits over Google Groups include 
> better support for topic tagging, community moderation features,  Markdown 
> (and hence syntax highlighting) in messages, inline previews of linked-to 
> Github issues, better mobile support, and more options for controlling when 
> and what you get emailed. The Discourse website 
>  does a better job of summarizing the 
> advantages than I could.
>
> To get things started, Mike Innes suggested having a topic on what we 
> plan on working on this coming week. 
> I think that's a great idea.
>
> Just to be clear, this isn't "official" in any sense - it's just to 
> kickstart the discussion. 
>
> -Jon
>
>
>

[julia-users] Re: Idea: Julia Standard Libraries and Distributions

2016-09-13 Thread mmh
What about building these packages into the sysimage, keeping them out of Base?

On Tuesday, September 13, 2016 at 4:39:15 AM UTC-4, Chris Rackauckas wrote:
>
> I think there is one major point of contention when talking about what 
> should be included in Base, due to competing factors:
>
>
>1. Some people would like a "lean Base" for things like embedded 
>installs or other low memory applications
>2. Some people want a MATLAB-like "bells and whistles" approach. This 
>way all the functions they use are just there: no extra packages to 
>find/import.
>3. Some people like having things in Base because it "standardizes" 
>things. 
>4. Putting things in Base constrains their release schedule. 
>5. Putting things in packages outside of JuliaLang helps free up 
>Travis.
>
>
> The last two concerns have been why things like JuliaMath have sprung up 
> to move things out of Base. However, I think there is some credibility to 
> having some form of standardization. I think this can be achieved through 
> some kind of standard library. This would entail a set of packages which 
> are installed when Julia is installed, and a set of packages which add 
> their using statement to the .juliarc. To most users this would be 
> seamless: they would install automatically, and every time you open Julia, 
> they would import automatically. There are a few issues there:
>
>
>1.  This wouldn't work with building from source. This idea works 
>better for binaries (this is no biggie since these users are likely more 
>experienced anyways)
>2. Julia would have to pick winners and losers.
>
> That second part is big: what goes into the standard library? Would all of 
> the MATLAB functions like linspace, find, etc. go there? Would the sparse 
> matrix library be included?
>
> I think one way to circumvent the second issue would be to allow for Julia 
> Distributions. A distribution would be defined by:
>
>
>1. A Julia version
>2. A List of packages to install (with versions?)
>3. A build script
>4. A .juliarc
>
> The ideal would be for one to be able to make an executable from those 
> parts which would install the Julia version with the specified packages, 
> build the packages (and maybe modify some environment variables / 
> defaults), and add a .juliarc that would automatically import some packages 
> / maybe define some constants or checkout branches. JuliaLang could then 
> provide a lean distribution and a "standard distribution" where the 
> standard distribution is a more curated library which people can fight 
> about, but it's not as big of a deal if anyone can make their own. This has 
> many upsides:
>
>
>1. Julia wouldn't have to come with what you don't want.
>2. Other than some edge cases where the advantages of Base come into 
>play (I don't know of a good example, but I know there are some things 
>which can't be defined outside of Base really well, like BigFloats? I'm 
> not 
>the expert on this.), most things could spawn out to packages without the 
>standard user ever noticing.
>3. There would still be a large set of standard functions you can 
>assume most people will have.
>4. You can share Julia setups: for example, with my lab I would share 
>a distribution that would have all of the JuliaDiffEq packages installed, 
>    along with Plots.jl and some backends, so that it would be an out-of-the-box 
>    "solve differential equations and plot" setup like what MATLAB provides. I 
>could pick packages/versions that I know work well together, 
>and guarantee their install will work. 
>5. You could write tutorials / run workshops which use a distribution, 
>knowing that a given set of packages will be available.
>6. Anyone could make their setup match yours by looking at the 
>distribution setup scripts (maybe just make a base function which runs 
> that 
>install since it would all be in Julia). This would be nice for some work 
>in progress projects which require checking out master on 3 different 
>    packages, and getting some weird branch for another 5. It would give you a 
>succinct and standardized way to specify an install to get there.
>
>
> Side notes:
>
> [An interesting distribution would be that JuliaGPU could provide a full 
> distribution for which CUDAnative works (since it requires a different 
> Julia install)]
>
> [A "Data Science Distribution" would be a cool idea: you'd never want to 
> include all of the plotting and statistical things inside of Base, but 
> someone pointing out what all of the "good" packages are that play nice 
> with each other would be very helpful.]
>
> [What if the build script could specify a library path, so that way it can 
> install a setup which doesn't interfere with a standard Julia install?]
>
> This is not without downsides. Indeed, one place where you can look is 
> Python. Python has distributions, but one problem with them is that 
> packages don't 
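To make the quoted proposal concrete, a distribution spec like the one described could be sketched as a small Julia type. Everything here (the `JuliaDistribution` name, the fields, the example packages and versions) is hypothetical, purely illustrative, and not an actual JuliaLang format:

```julia
# Hypothetical sketch of a "distribution" as plain data: a Julia version,
# pinned packages, a build script, and the .juliarc contents.
struct JuliaDistribution
    julia_version::VersionNumber
    packages::Vector{Pair{String,VersionNumber}}  # package => pinned version
    build_script::String                          # path to a build script
    juliarc::String                               # contents of the .juliarc
end

# An illustrative "lab" distribution (names and versions are made up):
dist = JuliaDistribution(
    v"0.5.0",
    ["Plots" => v"0.9.0", "DifferentialEquations" => v"1.0.0"],
    "build.jl",
    "using Plots\n",
)

@assert dist.julia_version == v"0.5.0"
@assert length(dist.packages) == 2
```

An installer could then walk `dist.packages`, check out the pinned versions, run the build script, and write out the `.juliarc`, which is all the sharing workflow in the post would need.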

Re: [julia-users] code_native complex for simple snippet

2016-08-30 Thread mmh
Is there an unsafe version of >> and <<  that does not do range checks?
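One possibility, as a sketch: the raw LLVM shifts with no range check are exposed as compiler intrinsics. Note these are internal, undocumented surface area that may change between Julia versions, and shifting by an amount greater than or equal to the bit width through them is undefined behavior, exactly as in C:

```julia
# Sketch only: Core.Intrinsics are not a documented API. Julia's << and >>
# define out-of-range shift amounts (returning 0 or the sign fill), which is
# what costs the range check; the intrinsics skip that check entirely.
unsafe_shl(x::Int, n::Int)  = Core.Intrinsics.shl_int(x, n)   # unchecked x << n
unsafe_ashr(x::Int, n::Int) = Core.Intrinsics.ashr_int(x, n)  # unchecked x >> n

@assert unsafe_shl(1, 3) == 8
@assert unsafe_ashr(-8, 2) == -2
# unsafe_shl(1, 64) is undefined: keep n < 64 yourself, the intrinsic won't.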


On Tuesday, August 30, 2016 at 6:46:36 PM UTC-4, Yichao Yu wrote:
>
>
>
> On Tue, Aug 30, 2016 at 6:25 PM, mmh <mum...@gmail.com > 
> wrote:
>
>>
>> [screenshot of @code_native output]
>>
>
> FWIW, your "simple" function has 10 operations that generate ~40 
> instructions when fully inlined, including the range check for the shifts, 
> prologue, epilogue, struct return, etc. That seems like a reasonable number 
> to me.
>
>>
>> The output for the Int32 and Int64 case. The following looks particularly 
>> bad for the Int32 case:
>>
>>     movabsq $">>", %rax
>>     movl    $6, %edx
>>     callq   *%rax
>>
>

[julia-users] code_native complex for simple snippet

2016-08-30 Thread mmh
[screenshots of @code_native output]

The output for the Int32 and Int64 case. The following looks particularly 
bad for the Int32 case:

    movabsq $">>", %rax
    movl    $6, %edx
    callq   *%rax






Re: [julia-users] Re: successfully built julia using windows subsystem for linux but getting '`: not enough memory (ENOMEM) when trying to use Compat

2016-08-24 Thread mmh
Thanks, didn't see that before.

On Wednesday, August 24, 2016 at 5:30:23 PM UTC-4, Isaiah wrote:
>
> Please see 
> https://groups.google.com/d/msg/julia-users/K3AKvD6EYT8/vGcchWRlAgAJ
>
> On Wed, Aug 24, 2016 at 5:26 PM, mmh <mum...@gmail.com > 
> wrote:
>
>>
>> versioninfo()
>> Julia Version 0.6.0-dev.390
>> Commit 3ab4d76 (2016-08-24 18:18 UTC)
>> Platform Info:
>>   System: Linux (x86_64-linux-gnu)
>>   CPU: unknown
>>   WORD_SIZE: 64
>>   BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
>>   LAPACK: libopenblas64_
>>   LIBM: libopenlibm
>>   LLVM: libLLVM-3.7.1 (ORCJIT, haswell)
>>
>>
>

[julia-users] Re: successfully built julia using windows subsystem for linux but getting '`: not enough memory (ENOMEM) when trying to use Compat

2016-08-24 Thread mmh

versioninfo()
Julia Version 0.6.0-dev.390
Commit 3ab4d76 (2016-08-24 18:18 UTC)
Platform Info:
  System: Linux (x86_64-linux-gnu)
  CPU: unknown
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.7.1 (ORCJIT, haswell)



[julia-users] Re: successfully built julia using windows subsystem for linux but getting '`: not enough memory (ENOMEM) when trying to use Compat

2016-08-24 Thread mmh

$ julia
   _
   _   _ _(_)_ |  A fresh approach to technical computing
  (_) | (_) (_)|  Documentation: http://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.6.0-dev.390 (2016-08-24 18:18 UTC)
 _/ |\__'_|_|_|\__'_|  |  Commit 3ab4d76 (0 days old master)
|__/   |  x86_64-linux-gnu


On Wednesday, August 24, 2016 at 5:25:18 PM UTC-4, mmh wrote:
>
> successfully built julia using windows subsystem for linux but getting '`: 
> not enough memory (ENOMEM) when trying to use Compat
>
> julia> using Compat
> INFO: Precompiling module Compat...
> ERROR: could not spawn `/home/me/julia/usr/bin/julia -Cnative 
> -J/home/me/julia/usr/lib/julia/sys.so --compile=yes --depwarn=yes -O0 
> --output-ji /home/me/.julia/lib/v0.6/Compat.ji --output-incremental=yes 
> --startup-file=no --history-file=no --color=yes --eval 'while !eof(STDIN)
> eval(Main, deserialize(STDIN))
> end
> '`: not enough memory (ENOMEM)
>  in _jl_spawn(::String, ::Array{String,1}, ::Ptr{Void}, ::Base.Process, 
> ::Base.PipeEndpoint, ::Base.TTY, ::Base.TTY) at ./process.jl:319
>  in #416 at ./process.jl:468 [inlined]
>  in setup_stdio(::Base.##416#417{Cmd,Ptr{Void},Base.Process}, 
> ::Tuple{Pipe,Base.TTY,Base.TTY}) at ./process.jl:458
>  in #spawn#415(::Nullable{Base.ProcessChain}, ::Function, ::Cmd, 
> ::Tuple{Pipe,Base.TTY,Base.TTY}) at ./process.jl:467
>  in (::Base.#kw##spawn)(::Array{Any,1}, ::Base.#spawn, ::Cmd, 
> ::Tuple{Pipe,Base.TTY,Base.TTY}) at ./:0
>  in #spawn#412(::Nullable{Base.ProcessChain}, ::Function, 
> ::Base.CmdRedirect, ::Tuple{Pipe,Base.TTY,Base.TTY}) at ./process.jl:351
>  in spawn(::Base.CmdRedirect, ::Tuple{Pipe,Base.TTY,Base.TTY}) at 
> ./process.jl:351
>  in open(::Base.CmdRedirect, ::String, ::Base.TTY) at ./process.jl:529
>  in create_expr_cache(::String, ::String) at ./loading.jl:458
>  in compilecache(::String) at ./loading.jl:504
>  in require(::Symbol) at ./loading.jl:364
>


[julia-users] successfully built julia using windows subsystem for linux but getting '`: not enough memory (ENOMEM) when trying to use Compat

2016-08-24 Thread mmh
successfully built julia using windows subsystem for linux but getting '`: 
not enough memory (ENOMEM) when trying to use Compat

julia> using Compat
INFO: Precompiling module Compat...
ERROR: could not spawn `/home/me/julia/usr/bin/julia -Cnative 
-J/home/me/julia/usr/lib/julia/sys.so --compile=yes --depwarn=yes -O0 
--output-ji /home/me/.julia/lib/v0.6/Compat.ji --output-incremental=yes 
--startup-file=no --history-file=no --color=yes --eval 'while !eof(STDIN)
eval(Main, deserialize(STDIN))
end
'`: not enough memory (ENOMEM)
 in _jl_spawn(::String, ::Array{String,1}, ::Ptr{Void}, ::Base.Process, 
::Base.PipeEndpoint, ::Base.TTY, ::Base.TTY) at ./process.jl:319
 in #416 at ./process.jl:468 [inlined]
 in setup_stdio(::Base.##416#417{Cmd,Ptr{Void},Base.Process}, 
::Tuple{Pipe,Base.TTY,Base.TTY}) at ./process.jl:458
 in #spawn#415(::Nullable{Base.ProcessChain}, ::Function, ::Cmd, 
::Tuple{Pipe,Base.TTY,Base.TTY}) at ./process.jl:467
 in (::Base.#kw##spawn)(::Array{Any,1}, ::Base.#spawn, ::Cmd, 
::Tuple{Pipe,Base.TTY,Base.TTY}) at ./:0
 in #spawn#412(::Nullable{Base.ProcessChain}, ::Function, 
::Base.CmdRedirect, ::Tuple{Pipe,Base.TTY,Base.TTY}) at ./process.jl:351
 in spawn(::Base.CmdRedirect, ::Tuple{Pipe,Base.TTY,Base.TTY}) at 
./process.jl:351
 in open(::Base.CmdRedirect, ::String, ::Base.TTY) at ./process.jl:529
 in create_expr_cache(::String, ::String) at ./loading.jl:458
 in compilecache(::String) at ./loading.jl:504
 in require(::Symbol) at ./loading.jl:364


[julia-users] Re: chol() more strict in v0.5?

2016-08-04 Thread mmh
In MATLAB I always add a little nugget to the diagonal to stabilize the 
Cholesky factorization.
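For reference, a minimal sketch of that trick in Julia (using the `cholesky`/`I` spelling of Julia ≥ 0.7; in the 0.5 era this discussion is about, the functions were `chol`/`cholfact` in Base):

```julia
using LinearAlgebra

# Symmetrize first to remove rounding noise (chol in 0.5 became strict about
# exact symmetry), then add a small "nugget" (jitter) to the diagonal so the
# matrix is safely positive definite.
A = [4.0 2.0; 2.0 3.0]
S = (A + A') / 2              # enforce exact symmetry
jitter = 1e-10
F = cholesky(S + jitter * I)  # I is the UniformScaling identity

@assert F.L * F.L' ≈ S + jitter * I
```

The jitter biases the factorization slightly, so it is a pragmatic fix; enforcing symmetry further upstream, as suggested in the quoted reply, is the cleaner solution.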

On Thursday, August 4, 2016 at 10:15:49 AM UTC-4, Chris wrote:
>
> The reasoning makes sense to me, it's just a minor annoyance to have it 
> fail due to rounding errors, essentially. I think going further "upstream" 
> in my code and enforcing symmetry more explicitly will ultimately lead to 
> cleaner code anyway. Thanks.



[julia-users] Re: How to make a variable length tuple with inferred type

2016-08-02 Thread mmh
Care to explain in more depth? If the function is type stable, i.e. it 
returns an Int for an Int input, then why would ntuple(::Function, ::Int) not 
be type stable? What do you mean by "the return type depends on the value 
of the integer" (it's an integer!)? Am I misunderstanding?
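To illustrate the point about value-dependence: the *length* of the returned tuple, and hence its type (`Tuple{Int64}`, `Tuple{Int64,Int64}`, …), is determined by the run-time value of `n`, which inference cannot see. A sketch of the standard workaround, lifting the length into the type domain (spelled `Val(3)` on Julia ≥ 0.7; on 0.5-era Julia it was `Val{3}`):

```julia
# Not inferable: the tuple length depends on the run-time value of n,
# so the best inference can do is the abstract type Tuple.
f(n) = ntuple(i -> 0, n)

# Inferable: the length 3 is part of the argument's *type*, so the return
# type is the concrete NTuple{3,Int64}.
g() = ntuple(i -> 0, Val(3))

@assert f(3) === (0, 0, 0)
@assert g() === (0, 0, 0)
# Compare `@code_warntype f(3)` against `@code_warntype g()` to see the
# abstract vs. concrete return type.
```

This is also why `Base.start(::FooIterator{d})` in the quoted thread can be made inferable: `d` already lives in the type domain there.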



On Monday, August 1, 2016 at 11:40:25 AM UTC-4, Kristoffer Carlsson wrote:
>
> Nope. ntuple(::Function, ::Int) is not type stable because the return type 
> depends on the value of the integer.
>
> On Monday, August 1, 2016 at 5:17:10 PM UTC+2, mmh wrote:
>>
>> Is this a known bug/regression?
>>
>> On Sunday, July 31, 2016 at 10:53:11 PM UTC-4, Sheehan Olver wrote:
>>>
>>> It still doesn't infer the type in 0.5:
>>>
>>> *julia> **@code_warntype ntuple( x -> 0, 3)*
>>>
>>> Variables:
>>>
>>>   #self#::Base.#ntuple
>>>
>>>   f::##5#6
>>>
>>>   n::Int64
>>>
>>>
>>> Body:
>>>
>>>   begin 
>>>
>>>   unless (Base.sle_int)(n::Int64,0)::Bool goto 3
>>>
>>>   return (Core.tuple)()::Tuple{}
>>>
>>>   3: 
>>>
>>>   unless (n::Int64 === 1)::Bool goto 6
>>>
>>>   return (Core.tuple)($(QuoteNode(0)))::Tuple{Int64}
>>>
>>>   6: 
>>>
>>>   unless (n::Int64 === 2)::Bool goto 9
>>>
>>>   return 
>>> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64}
>>>
>>>   9: 
>>>
>>>   unless (n::Int64 === 3)::Bool goto 12
>>>
>>>   return 
>>> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64}
>>>
>>>   12: 
>>>
>>>   unless (n::Int64 === 4)::Bool goto 15
>>>
>>>   return 
>>> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64,Int64}
>>>
>>>   15: 
>>>
>>>   unless (n::Int64 === 5)::Bool goto 18
>>>
>>>   return 
>>> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64,Int64,Int64}
>>>
>>>   18: 
>>>
>>>   unless (Base.slt_int)(n::Int64,16)::Bool goto 21
>>>
>>>   return (Core._apply)(Core.tuple,$(Expr(:invoke, LambdaInfo for 
>>> ntuple(::##5#6, ::Int64), :(Base.ntuple), :(f), 
>>> :((Base.box)(Int64,(Base.sub_int)(n,5),(Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64,Int64,Int64})
>>> *::Tuple{Vararg{Any,N}}*
>>>
>>>   21: 
>>>
>>>   return $(Expr(:invoke, LambdaInfo for _ntuple(::Function, 
>>> ::Int64), :(Base._ntuple), :(f), :(n)))
>>>
>>>   end*::Tuple*
>>>
>>> On Monday, August 1, 2016 at 10:34:30 AM UTC+10, David P. Sanders wrote:
>>>>
>>>>
>>>>
>>>> El domingo, 31 de julio de 2016, 20:16:04 (UTC-4), Sheehan Olver 
>>>> escribió:
>>>>>
>>>>> I'm doing the following:
>>>>>
>>>>>
>>>>> immutable FooIterator{d} end
>>>>>
>>>>> Base.start(::FooIterator{d}) = tuple(zeros(Int,d)...)::NTuple{d,Int}
>>>>>
>>>>
>>>>
>>>> You can use the `ntuple` function, which constructs a tuple from a 
>>>> function:
>>>>
>>>> julia> ntuple( x -> 0, 3)
>>>> (0,0,0)
>>>>
>>>> julia> typeof(ans)
>>>> Tuple{Int64,Int64,Int64}
>>>>  
>>>>
>>>>>
>>>>>
>>>>> But is there a more elegant way of getting the type inferred?  I 
>>>>> suppose I can override low order d directly:
>>>>>
>>>>> Base.start(::FooIterator{2}) = (0,0)
>>>>> Base.start(::FooIterator{3}) = (0,0,0)
>>>>>
>>>>

[julia-users] Re: How to make a variable length tuple with inferred type

2016-08-01 Thread mmh
Is this a known bug/regression?

On Sunday, July 31, 2016 at 10:53:11 PM UTC-4, Sheehan Olver wrote:
>
> It still doesn't infer the type in 0.5:
>
> *julia> **@code_warntype ntuple( x -> 0, 3)*
>
> Variables:
>
>   #self#::Base.#ntuple
>
>   f::##5#6
>
>   n::Int64
>
>
> Body:
>
>   begin 
>
>   unless (Base.sle_int)(n::Int64,0)::Bool goto 3
>
>   return (Core.tuple)()::Tuple{}
>
>   3: 
>
>   unless (n::Int64 === 1)::Bool goto 6
>
>   return (Core.tuple)($(QuoteNode(0)))::Tuple{Int64}
>
>   6: 
>
>   unless (n::Int64 === 2)::Bool goto 9
>
>   return 
> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64}
>
>   9: 
>
>   unless (n::Int64 === 3)::Bool goto 12
>
>   return 
> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64}
>
>   12: 
>
>   unless (n::Int64 === 4)::Bool goto 15
>
>   return 
> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64,Int64}
>
>   15: 
>
>   unless (n::Int64 === 5)::Bool goto 18
>
>   return 
> (Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64,Int64,Int64}
>
>   18: 
>
>   unless (Base.slt_int)(n::Int64,16)::Bool goto 21
>
>   return (Core._apply)(Core.tuple,$(Expr(:invoke, LambdaInfo for 
> ntuple(::##5#6, ::Int64), :(Base.ntuple), :(f), 
> :((Base.box)(Int64,(Base.sub_int)(n,5),(Core.tuple)($(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)),$(QuoteNode(0)))::Tuple{Int64,Int64,Int64,Int64,Int64})
> *::Tuple{Vararg{Any,N}}*
>
>   21: 
>
>   return $(Expr(:invoke, LambdaInfo for _ntuple(::Function, ::Int64), 
> :(Base._ntuple), :(f), :(n)))
>
>   end*::Tuple*
>
> On Monday, August 1, 2016 at 10:34:30 AM UTC+10, David P. Sanders wrote:
>>
>>
>>
>> El domingo, 31 de julio de 2016, 20:16:04 (UTC-4), Sheehan Olver escribió:
>>>
>>> I'm doing the following:
>>>
>>>
>>> immutable FooIterator{d} end
>>>
>>> Base.start(::FooIterator{d}) = tuple(zeros(Int,d)...)::NTuple{d,Int}
>>>
>>
>>
>> You can use the `ntuple` function, which constructs a tuple from a 
>> function:
>>
>> julia> ntuple( x -> 0, 3)
>> (0,0,0)
>>
>> julia> typeof(ans)
>> Tuple{Int64,Int64,Int64}
>>  
>>
>>>
>>>
>>> But is there a more elegant way of getting the type inferred?  I suppose 
>>> I can override low order d directly:
>>>
>>> Base.start(::FooIterator{2}) = (0,0)
>>> Base.start(::FooIterator{3}) = (0,0,0)
>>>
>>

[julia-users] Re: Dear Sublime Text Users

2016-07-09 Thread mmh
Thank you! I keep trying Atom (but it's painfully slow) and on many 
occasions am forced to use it due to its better Julia support. This is a 
very welcome package. How does reverse lookup work? I selected some text 
and then entered reverse lookup, but nothing happened. Also, what kind of 
font are you using that you get colored emoji? Thanks  

On Saturday, July 9, 2016 at 3:42:13 AM UTC-4, Randy Lai wrote:
>
> Hi all Sublime Text users,
>
>
> I have just created a tiny package, Julia-Unicode, to help with inserting 
> LaTeX and Unicode symbols in Sublime Text. It uses the Julia mappings, so 
> you don't have to worry about 
> the difference between ɛ (\varepsilon) and ϵ (\epsilon).
> You may have heard of Julia Completions or UnicodeMath; I promise you that 
> Julia-Unicode works better than them :).
>
> Check it out from Package Control.
>
> Feedback appreciated.
>


[julia-users] Re: Master list of constants

2016-07-05 Thread mmh
On v0.5 this gives:

ERROR: UndefVarError: @MIME not defined
 in eval(::Module, ::Any) at .\boot.jl:234
 in macro expansion; at .\REPL[4]:2 [inlined]
 in anonymous at .\:?
 in eval(::Module, ::Any) at .\boot.jl:234
 in macro expansion at .\REPL.jl:92 [inlined]
 in (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at .\event.jl:46


On Tuesday, July 5, 2016 at 7:23:47 AM UTC-4, Lyndon White wrote:
>
> You can use the `names` function to search a module for all constants it 
> defines: evaluate each name to get its value, then check whether it is an 
> Irrational (formerly known as MathConst). 
> Doing this, I conclude that the ASCII name for the Euler-Mascheroni 
> constant (γ) is eulergamma:
>
>
>  
>
>
> julia> for name_sym in names(Base)
>   value = eval(name_sym)
>   if typeof(value) <: Irrational
> println(name_sym, "\t", value)
>   end
>end
>
>
> catalan catalan = 0.9159655941772...
> e   e = 2.7182818284590...
> eu  e = 2.7182818284590...
> eulergamma  γ = 0.5772156649015...
> golden  φ = 1.6180339887498...
> pi  π = 3.1415926535897...
> γ   γ = 0.5772156649015...
> π   π = 3.1415926535897...
> φ   φ = 1.6180339887498...
>
>
>
>
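A sketch of a variant that may avoid the error reported above: on 0.5, `eval`-ing every name from `names(Base)` can trip over exported macro names such as `@MIME`, whereas `getfield` simply fetches each binding's value without evaluating it as code.

```julia
# Same scan as the quoted loop, but using getfield instead of eval.
# isdefined guards against names that are exported but not yet bound.
for name_sym in names(Base)
    isdefined(Base, name_sym) || continue
    value = getfield(Base, name_sym)
    if value isa Irrational
        println(name_sym, "\t", value)
    end
end
```

The output should match the list quoted above (catalan, e/eu, eulergamma/γ, golden/φ, pi/π).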

Re: [julia-users] Re: JuliaCon schedule announced

2016-06-28 Thread mmh
Hi Viral, do we have an ETA on when the talks will be up on YouTube?

On Wednesday, June 22, 2016 at 11:13:25 AM UTC-4, Viral Shah wrote:
>
> Live streaming was too expensive and we did not do it this year, but we 
> certainly want to next year.
>
> -viral
> On Jun 22, 2016 10:33 AM, "Gabriel Gellner" wrote:
>
>> For future conferences I would be super stoked to pay some fee to have 
>> early access if that would help at all. Super stoked to see so many of 
>> these sweet talks!
>>
>> On Wednesday, June 22, 2016 at 6:49:43 AM UTC-7, Viral Shah wrote:
>>>
>>> Yes they will be and hopefully much sooner than last year.
>>>
>>> -viral
>>> On Jun 22, 2016 7:31 AM, "nuffe"  wrote:
>>>
 Will all the talks be posted on youtube, like last year? If so, do you 
 know when? Thank you (overseas enthusiast) 

 On Thursday, June 9, 2016 at 11:34:18 PM UTC+2, Viral Shah wrote:
>
> The JuliaCon talks and workshop schedule has now been announced.
>
> http://juliacon.org/schedule.html
>
> Please buy your tickets if you have been procrastinating. We have seen 
> tickets going much faster this year, and waiting until the day before is 
> unlikely to work this year. Please also spread the message to your 
> friends 
> and colleagues and relevant mailing lists. Here's the conference poster 
> for 
> emailing and printing:
>
> http://juliacon.org/pdf/juliacon2016poster3.pdf
>
> -viral
>