[julia-users] Re: ANN: JuMP 0.10 released

2015-09-01 Thread Joey Huchette

On Tuesday, September 1, 2015 at 1:27:25 PM UTC-4, Fabrizio Lacalandra 
wrote:
>
> Very nice job! May I suggest we start thinking about:
>
> 1) @removeconstraint
>

Should be pretty easy to add a function that does this in just a few lines 
of code.
 

> 2) @defVar with an indication of branching priority for integer vars, 
> accepted by Cplex/Gurobi at least
>

It's not quite at the JuMP level, but you can set the branching priority 
[for 
CPLEX](https://github.com/JuliaOpt/CPLEX.jl/blob/25ebbf1c8444c961045b9b15b1a225432eadb811/src/cpx_solve.jl#L15-L37).
 

> 3) Some kind of special Constraint Programming-like construct, such as 
> classic allDiff and more advanced things?
>

This is definitely on the agenda! It will probably live most naturally in a 
separate package, but it seems like a good choice for the next "big" 
project.
 

> 4) Start a discussion on how modularity in the problem construction can be 
> enhanced (not sure in which direction)
>

We'd definitely appreciate any user feedback on how this should work.
 

>
> BTW, do we have a JuMP wish list somewhere?
>

No official list, but feel free to open issues for feature requests on the 
Github page.
 

>
> Thanks
> Fabrizio 
>
> On Tuesday, September 1, 2015 at 6:41:21 AM UTC+2, Miles Lubin wrote:
>>
>> The JuMP team is happy to announce the release of JuMP 0.10.
>>
>> This is a major release with the greatest amount of new functionality 
>> since the addition of nonlinear modeling last year. This will likely be the 
>> last major release of JuMP to support Julia 0.3. Thanks to the heroic work 
>> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for 
>> *semidefinite programming*.
>>
>> You can now write, for example:
>>
>> @defVar(m, x[1:5])
>> @addConstraint(m, A*x .== 0)
>>
>> where A is a Julia matrix (dense or sparse). Note that we require dot 
>> comparison operators .== (and similarly .<= and .>=) for vectorized 
>> constraints. The vectorized syntax extends to quadratic but not general 
>> nonlinear expressions.
>>
>> An important new concept to keep in mind is that this vectorized syntax 
>> only applies to sets of variables which are one-based arrays. If you 
>> declare variables indexed by more complicated sets, e.g.,
>>
>> @defVar(m, y[3:5])
>> s = [:cat, :dog, :pizza]
>> @defVar(m, z[s])
>>
>> then dot(y,z) and rand(3,3)*z are undefined. A result of this new 
>> concept of one-based arrays is that x above now has the type 
>> Vector{JuMP.Variable}. In this case, getValue() now returns a 
>> Vector{Float64} instead of an opaque JuMP object. We hope users find 
>> this new distinction between one-indexed array variables and all other 
>> symbolically indexed variables useful and intuitive (if not, let us know).
>>
>> For semidefinite modeling, you can declare variables as SDP matrices and 
>> add LMI (linear matrix inequality) constraints as illustrated in the 
>> examples for minimal ellipse 
>> <https://github.com/JuliaOpt/JuMP.jl/blob/6e7c86acfe09c4970741d957e381446bfd7630ca/examples/minellipse.jl>
>>  and 
>> max cut 
>> <https://github.com/JuliaOpt/JuMP.jl/blob/6e7c86acfe09c4970741d957e381446bfd7630ca/examples/maxcut_sdp.jl>,
>>  
>> among others.
>>
>> We also have a *new syntax for euclidean norms:*
>>
>> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
>> # or
>> @addConstraint(m, norm(c.*x+b) <= 10)
>>
>> You may be wondering how JuMP compares with Convex.jl given these new 
>> additions. Not much has changed philosophically; JuMP directly translates 
>> SDP constraints and euclidean norms into the sparse matrix formats as 
>> required by conic solvers. Unlike Convex.jl, *JuMP accepts only 
>> standard-form SDP and second-order conic constraints and will not perform 
>> any automatic transformations* such as modeling nuclear norms, minimum 
>> eigenvalue, geometric mean, rational norms, etc. We would recommend using 
>> Convex.jl for easy modeling of such functions. Our focus, for now, is on 
>> the large-scale performance and stability of the huge amount of new syntax 
>> introduced in this release.
>>
>> Also notable in this release:
>> - JuMP models now store a dictionary of attached variables, so that you 
>> can look up a variable from a model by name by using the new getVar() 
>> method.
>> - On Julia 0.4 only, you can now add a filter to variable declarations, 
>> e.g.,
>> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
>> will only create variables for the indices which satisfy the condition.
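
The filter semantics can be sketched in plain Julia with no JuMP involved. Note this uses the generator-filter syntax of later Julia versions, so treat it as an illustration rather than 0.3-era code:

```julia
# Enumerate the index pairs kept by the condition i + j >= 3.
# Of the 25 pairs in 1:5 x 1:5, only (1, 1) fails it, so 24 survive.
kept = [(i, j) for i in 1:5 for j in 1:5 if i + j >= 3]
length(kept)  # 24, the number of variables @defVar would create
```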

[julia-users] Re: Errors while trying to use JuMP

2015-06-12 Thread Joey Huchette


What’s the output of Pkg.status()? It seems like an issue with Docile, not 
JuMP.

On Friday, June 12, 2015 at 9:25:50 AM UTC-4, Kostas Tavlaridis-Gyparakis 
wrote:

 Hello,
> I am really new to Julia and not the best programmer myself.
> I run linux mint 16 and I installed julia via terminal (sudo apt-get 
> install julia ).
> Then when I run Julia in my terminal, the JuMP package is installed,
> something I believe the following two commands prove:
>
> julia> Pkg.update()
> INFO: Updating METADATA...
> INFO: Computing changes...
> INFO: No packages to install, update or remove
>
> julia> Pkg.add("JuMP")
> INFO: Nothing to be done
>
> Then when I try to use JuMP I get the following error msg:
>
> julia> using JuMP
> ERROR: Block not defined
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in reload_path at loading.jl:152
> in _require at loading.jl:67
> in require at loading.jl:54
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in reload_path at loading.jl:152
> in _require at loading.jl:67
> in require at loading.jl:54
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in reload_path at loading.jl:152
> in _require at loading.jl:67
> in require at loading.jl:54
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in reload_path at loading.jl:152
> in _require at loading.jl:67
> in require at loading.jl:54
> in include at ./boot.jl:245
> in include_from_node1 at ./loading.jl:128
> in reload_path at loading.jl:152
> in _require at loading.jl:67
> in require at loading.jl:51
> while loading /home/kostas/.julia/v0.3/Docile/src/types.jl, in expression 
> starting on line 2
> while loading /home/kostas/.julia/v0.3/Docile/src/Docile.jl, in expression 
> starting on line 19
> while loading 
> /home/kostas/.julia/v0.3/DataStructures/src/DataStructures.jl, in 
> expression starting on line 48
> while loading /home/kostas/.julia/v0.3/Graphs/src/Graphs.jl, in expression 
> starting on line 2
> while loading /home/kostas/.julia/v0.3/ReverseDiffSparse/src/coloring.jl, 
> in expression starting on line 1
> while loading 
> /home/kostas/.julia/v0.3/ReverseDiffSparse/src/ReverseDiffSparse.jl, in 
> expression starting on line 20
> while loading /home/kostas/.julia/v0.3/JuMP/src/JuMP.jl, in expression 
> starting on line 13
>
> Finally the msg I receive for Block is the following:
>
> julia> Pkg.checkout("Block")
> ERROR: Block is not a git repo
> in checkout at pkg/entry.jl:190
> in anonymous at pkg/dir.jl:28
> in cd at ./file.jl:20
> in cd at pkg/dir.jl:28
> in checkout at pkg.jl:33
>  


[julia-users] Re: using variables from the workspace to define JuMP problem

2015-05-04 Thread Joey Huchette


Yes:

@defVar(m, x[i = 1:length(z)] >= z[i])

This error message is really opaque, though; I’ve updated it on JuMP master 
to be a little more informative.

-Joey

On Monday, May 4, 2015 at 6:23:45 PM UTC-4, Alexandros Fakos wrote:

Hi,
>
> z is a variable in the workspace  
>
> Ideally I would like to define variable x for the optimization problem 
> like:
> @defVar(m, x[ 1 : length(z) ] >= z )
> which gives the error
> ERROR: `Variable` has no method matching Variable(::Model, 
> ::Array{Float64,1}, ::Float64, ::Symbol, ::ASCIIString, ::Float64)
>  in anonymous at julia/v0.3/JuMP/src/macros.jl:81
>
> Is there any way to do it?
>
> Thanks,
> Alex
>


[julia-users] Re: how to display the whole model

2015-04-24 Thread Joey Huchette


print(mod)

On Friday, April 24, 2015 at 2:02:59 PM UTC-4, Michela Di Lullo wrote:

Hello everyone,
>
> I'm new to julia and I was wondering how to display the whole model. 
> I tried with: 
>
> mod=Model(...)
> ...
>
> display(mod)
>
>
> Feasibility problem with:
>
>  * 144 linear constraints
>
>  * 2822 variables: 1514 binary
>
> Solver set to Gurobi
>
>
> but it only says the *number* of variables and constraints in the model.. 
> while I want to see the constraints/objective in its/their expanded forms.  
>
>
> Any idea about how to make it? 
>
> Thank you all for any suggestion :)
>


[julia-users] Re: Beginner Issues

2015-03-08 Thread Joey Huchette


This code works for me in Julia v0.3.5 and v0.4. Can you provide the output 
of versioninfo() and Pkg.status()?

Also, there is a julia-opt mailing list 
(https://groups.google.com/forum/#!forum/julia-opt) that is the best venue 
to ask questions specific to optimization in Julia.

On Sunday, March 8, 2015 at 3:08:32 PM UTC-4, Lauren Clisby wrote:


>
> Just started using Julia, JuMP, Cbc, GLPK, and Gurobi this week for a 
> class.  Trying to solve a basic LP and compare Cbc, GLPK, and Gurobi for 
> speed (as a class project).
>
> I've had a lot of issues trying to learn these from scratch (I've coded in 
> VBA, HTML, and Java, and use MATLAB regularly) so please help me out with 
> these beginner questions.
>
> I've loaded the packages without errors, it's just when I try to solve an 
> LP that it breaks.
> Right now I'm just trying to run a basic model and get this error:
> *ERROR: stack overflow in Model at none:2 (repeats 63846 times)*
>
> Here's the code I'm trying to run, which is basically straight from an 
> online example:
>
>
> using JuMP
>
> function Model2()
>     m = Model()
>     @defVar(m, x[1:5], Bin)
>     profit = [5, 3, 2, 7, 4]
>     weight = [2, 8, 4, 2, 5]
>     capacity = 10
>     # Objective: maximize profit
>     @setObjective(m, Max, dot(profit, x))
>     # Constraint: can carry all
>     @addConstraint(m, dot(weight, x) <= capacity)
>     # Solve problem using MIP solver
>     status = solve(m)
>     println("Objective is: ", getObjectiveValue(m))
>     println("Solution is:")
>     for i = 1:5
>         print("x[$i] = ", getValue(x[i]))
>         println(", p[$i]/w[$i] = ", profit[i]/weight[i])
>     end
> end
>
> Model2()
>
> I'm using it as a function because I'm using the command window version 
> and otherwise it runs every line separately and breaks. 
>
> Thanks!
>


[julia-users] Re: MIQP in Julia Gurobi interface

2015-01-17 Thread Joey Huchette


Hey Micah,
This is definitely possible, provided your quadratic constraints are convex 
or second-order cone representable. See the Gurobi.jl readme for how to set 
up a QP and how to specify integer variables. Also check out our modeling 
package JuMP (https://github.com/JuliaOpt/JuMP.jl) for an algebraic modeling 
language that I’d recommend using over the lower-level Gurobi interface 
directly.

Also, there’s a julia-opt mailing list 
(https://groups.google.com/forum/#!forum/julia-opt) we’ve set up for these 
types of questions related to optimization in Julia.

On Saturday, January 17, 2015 at 6:19:59 PM UTC-5, Micah McClimans wrote:

I'm trying to figure out how to set up a Mixed Integer Quadratic Program in 
> Gurobi through Julia, and I'm not seeing how to do it. I'm not seeing 
> anything in the source code that even suggests that it can be done, but 
> since I can't see anything in the Gurobi docs explaining how to do it 
> either, I could very well be missing something important. I'd like to avoid 
> actually modifying the Gurobi package, since I could easily screw something 
> up, but if MIQP can't be done in Gurobi/Julia, I suppose I would.
>


[julia-users] Re: equivalent of IDL's where?

2015-01-08 Thread Joey Huchette


On Thursday, January 8, 2015 at 8:31:07 PM UTC-5, tugul…@gmail.com wrote:

absolute newb here, with no formal coding education (science). Came here 
> from the Fortran and IDL territory.
>
> I have two quick questions, and I apologize beforehand if these have been 
> discussed before:
>
> (1) is there an equivalent of IDL function "where" in julia? a function 
> that returns indices of the array elements that satisfy the specified 
> condition(s), i.e. [1,2,3]=where(a < 5), with a=[4,4,4,5,5,5]
>
find(a .< 5)


> (2) can I declare a multidimentional array in a single line? in 1D I see 
> that I can do following:
>  array1D = [i^2 for i=1:10]
> how would I do something like:
>  array2D = [i+j for i=1:10, for j=1:10]
>
array2D = [i+j for i=1:10, j=1:10]


> thanks,
> Tuguldur
>
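
A note for readers on current Julia: `find` was later renamed `findall` (and comparisons broadcast with a dot, as above). A sketch of both answers, assuming Julia 1.x:

```julia
a = [4, 4, 4, 5, 5, 5]
idx = findall(a .< 5)                       # indices where the condition holds
array2D = [i + j for i in 1:10, j in 1:10]  # a 10x10 matrix in one line
```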


Re: [julia-users] Re: home page content

2014-12-09 Thread Joey Huchette
I think the [Rust website](http://www.rust-lang.org/) is pretty fantastic, 
in terms of both design and content. Having the code examples runnable and 
editable (via JuliaBox) would be a killer feature, though I have no idea 
how feasible that is.

On Tuesday, December 9, 2014 6:54:33 PM UTC-5, Elliot Saba wrote:
>
> Perhaps not now, but as a long-term goal, having a live, editable widget 
> of code on the homepage is such an awesome draw-in, IMO.
> -E
>
>> On Tue, Dec 9, 2014 at 3:43 PM, Leah Hanson wrote:
>
>> Seeing code examples of a type and a couple of functions that use it 
>> would probably give a good idea of what the code looks like. The JuMP seems 
>> exciting enough to highlight both as a package and a use of macros.
>>
>> I don't know if you want to encourage different styles, but seeing 
>> examples of Python like, c like, and functional-ish ways of writing Julia 
>> would be a way to show off the variety of things you can do.
>>
>> --Leah
>>
>> On Wed, Dec 10, 2014 at 8:50 AM Elliot Saba wrote:
>>> We're having intermittent DNS issues.  http://julialang.org is now up 
>>> for me however, and I can dig it: (I couldn't, previously)
>>>
>>> $ dig julialang.org
>>>
>>> ; <<>> DiG 9.8.3-P1 <<>> julialang.org
>>> ;; global options: +cmd
>>> ;; Got answer:
>>> ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 56740
>>> ;; flags: qr rd ra; QUERY: 1, ANSWER: 3, AUTHORITY: 4, ADDITIONAL: 4
>>>
>>> ;; QUESTION SECTION:
>>> ;julialang.org. IN  A
>>>
>>> ;; ANSWER SECTION:
>>> julialang.org.  2202IN  CNAME   julialang.github.io.
>>> julialang.github.io.2202IN  CNAME   github.map.fastly.net.
>>> github.map.fastly.net.  15  IN  A   199.27.79.133
>>> ...
>>> -E
>>>
>>>
>>>
>>> On Tue, Dec 9, 2014 at 2:46 PM, > wrote:
>>>


 On Wednesday, December 10, 2014 8:23:26 AM UTC+10, Stefan Karpinski 
 wrote:
>
> We're looking to redesign the JuliaLang.org home page and try to give 
> it a little more focus than it currently has. Which raises the question 
> of 
> what to focus on. We could certainly have better code examples and maybe 
> highlight features of the language and its ecosystem better. What do 
> people 
> think we should include?
>

 The whole site seems to be offline?  Is that because of this? 

>>>
>>>  
>

Re: [julia-users] Re: PSA: Choosing between Julia 0.3 vs Julia 0.4

2014-09-26 Thread Joey Huchette

>
> I followed your advice and, after unintentionally being moved to 0.4, managed 
> to 
> move back to 0.3.1 -- at the price that some packages (like Cbc) don't 
> build 
> anymore at the moment. Still I think it is unfortunate to keep interested 
> and 
> adventurous users away from a development version.
>

Please do open an issue (https://github.com/JuliaOpt/Cbc.jl/issues/new) if 
you have build troubles, especially on the latest official release!


Re: [julia-users] Optional import mechanism

2014-08-19 Thread Joey Huchette


It’s a bit ugly, but this should work:

try
    eval(Expr(:import, :ImageView))
    global view = ImageView.view
catch err
    @show err
    # fall back to a stub with the same calling convention
    global view = (args...; kargs...) -> (nothing, nothing)
end
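
On current Julia the same optional-import idea can be written a bit more directly. A hedged sketch (the stub name `imgview` is made up for illustration):

```julia
# Try to load the optional package once and remember whether it worked.
const HAS_IMAGEVIEW = try
    @eval import ImageView
    true
catch
    false
end

# Dispatch to the real function when available, otherwise a harmless stub
# with the same calling convention.
imgview(args...; kwargs...) =
    HAS_IMAGEVIEW ? ImageView.view(args...; kwargs...) : (nothing, nothing)
```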

On Tuesday, August 19, 2014 9:00:39 PM UTC-4, Júlio Hoffimann wrote:

Hi Kevin,
>
> What happens if you use import instead is require?
>>
>
> ERROR: error compiling anonymous: unsupported or misplaced expression 
> import in function anonymous
>
> Júlio.
>


[julia-users] Re: uncurried functions are ok... if you know what you are doing

2014-06-18 Thread Joey Huchette
See here (https://github.com/JuliaLang/julia/issues/554) for discussion on 
currying in Julia.

On Wednesday, June 18, 2014 6:01:38 PM UTC-5, Peter Simon wrote:
>
> How about
>
> pairs = [e for e in enumerate(sqr)] 
> filter(p -> +(p...)%4 != 0, pairs)
>
> ?
>
> --Peter
>
> On Wednesday, June 18, 2014 3:34:24 PM UTC-7, gentlebeldin wrote:
>>
>> When you come across Julia, and know (a bit of) Haskell, comparisons are 
>> inevitable. Haskell has only functions with one argument, but the function 
>> value may be another function. It takes some time to wrap your brain around 
>> that idea, but hey, it works.
>> Another type of functions with just one argument, possibly a tuple, may 
>> work as well, I guess, and that seemed to be the choice for Julia: 
>> uncurried.
>> So let's define a function taking a pair: f=(x,y)->x*y, and define a 
>> pair: p=(2,3), and feed it to our function: f(p)
>> ERROR: wrong number of arguments
>> Oops... 
>> Suppose we have an array, sqr=[i^2 for i=1:5], and want to filter it, but 
>> depending on the index, too. There's a function for that, enumerate. Let's 
>> assume we want only those values where the index plus value (square) isn't 
>> divisible by 4. Check enumerate: [e for e in enumerate(sqr)]:
>> 5-element Array{(Int64,Any),1}:
>>  (1,1) 
>>  (2,4) 
>>  (3,9) 
>>  (4,16)
>>  (5,25)
>> Ok, those are pairs, so let's filter:
>> filter((x,y)->(x+y)%4!=0,[e for e in enumerate(sqr)])
>> You guessed it, same "oops" as above.
>> Well, you can work around that mess, naturally, but looking at the 
>> candidates...
>> filter(p->(p[1]+p[2])%4!=0,[e for e in enumerate(sqr)])
>> filter(p->(first(p)+last(p))%4!=0,[e for e in enumerate(sqr)])
>> filter(p->((x,y)->(x+y)%4!=0)(p...),[e for e in enumerate(sqr)])
>> All of them work, but what would be your favorite obfuscation of a simple 
>> thought? The last example suggests that the whole mess may be collateral 
>> damage from supporting varargs, but... is it worth that?
>> Don't get me wrong, I like Julia, there's a lot of nice and powerful 
>> concepts, like multiple dispatch, but why does she have to have those warts 
>> right on her nose? ;-)
>>
>>
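
Later Julia versions did grow tuple destructuring in anonymous-function arguments, which addresses this complaint fairly cleanly. A sketch assuming Julia 1.x:

```julia
sqr = [i^2 for i in 1:5]
# The extra parentheses plus trailing comma destructure each (index, value) pair.
result = filter(((i, v),) -> (i + v) % 4 != 0, collect(enumerate(sqr)))
# keeps (1,1), (2,4), (5,25); their sums 2, 6, 30 are not divisible by 4
```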

[julia-users] Re: ANN: LinearMaps

2014-06-15 Thread Joey Huchette
This looks very, very cool.

On Sunday, June 15, 2014 12:08:52 PM UTC-5, Jutho wrote:
>
> Dear Julia users,
>
> I have just added a new Julia package for working with linear maps, also 
> known as linear transformations or linear operators (although the latter 
> term restricts to the slightly more restricted case where domain and 
> codomain are the same, i.e. number of rows and columns of the matrix 
> representation are the same). The idea is that a LinearMap object is like 
> an AbstractMatrix in that it transforms vectors into vectors by 
> multiplication, but that it does not need to be represented as a Matrix.
>
> The package provides functionality for wrapping linear functions without 
> explicit matrix representation (e.g. cumsum, fft, diff, ... ) as LinearMap 
> objects, for combining LinearMap objects via composition or linear 
> combination, and for taking transposes. All of these operations are 
> evaluated 'lazily', i.e. only when the combined object is multiplied with a 
> vector will the operations be performed, and in such a way that only matrix 
> vector multiplications and vector additions need to be computed. In 
> addition, these operations are implemented  with efficiency in mind. They 
> try to minimise the number of temporaries (i.e. minimal memory allocation) 
> and make available mutating operations such as A_mul_B!, even when this was 
> not provided by e.g. the function that performs the matrix vector product. 
> In addition, they can be combined with normal AbstractMatrix objects and 
> with the built in identity I (UniformScaling). 
>
> The application of this package is in first place in the context of 
> iterative algorithms for linear systems or eigenvalue problems, which 
> should be implemented using duck typing of the matrix/operator argument. In 
> particular, it works well in combination with eigs, Julia's built in 
> wrapper for Arpack. Like AbstractMatrix objects, LinearMap objects respond 
> to the methods size, isreal, eltype, issym, ishermitian and isposdef, where 
> it is even attempted to cleverly guess the correct values for these latter 
> properties (i.e. A'*B*B'*A will be recognised as a positive definite 
> hermitian linear map).
>
> Check it out at https://github.com/Jutho/LinearMaps.jl . Comments and 
> suggestions are appreciated.
>
> Jutho
>


[julia-users] Re: Scoping issues with nested function definition

2014-06-14 Thread Joey Huchette
Ahh ok, thanks for the tips. I should have known I was trying to be a bit 
too tricky...

On Friday, June 13, 2014 8:27:28 PM UTC-5, Steven G. Johnson wrote:
>
> Yes, eval defines things in the top-level scope, so you normally don't use 
> it inside functions.
>
> However, cfunction inherently does not work with anonymous functions, 
> essentially because C functions can't represent closures.
>
> To pass closures (anonymous/inner functions) to C, you'll want to use a 
> different technique.   See:
>
>http://julialang.org/blog/2013/05/callback/
>
> for an overview.
>


[julia-users] Scoping issues with nested function definition

2014-06-13 Thread Joey Huchette
Consider:

julia> function foo()
   f() = nothing
   target = :g
   name = :f
   @eval $target = cfunction($name, Nothing, ())
   end
foo (generic function with 1 method)


julia> foo()
ERROR: f not defined
 in foo at none:5


and

julia> f() = nothing
f (generic function with 1 method)


julia> target = :g
:g


julia> name = :f
:f


julia> @eval $target = cfunction($name, Nothing, ())
Ptr{Void} @0x000113b4ba50


I suspect this is because eval is trying to define f at top-level, but is 
there any way to define g in the first example so that it can be used in 
foo?


Re: [julia-users] Re: how to apply vector on diagonal of matrix ?

2014-05-20 Thread Joey Huchette
Perhaps a setdiag!(b,a) function would be handy.
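
A minimal sketch of such a helper (the name `setdiag!` is hypothetical; on later Julia the stdlib spelling would be `b[diagind(b)] .= a` with LinearAlgebra loaded):

```julia
# Hypothetical in-place helper: copy vector a onto the main diagonal of b.
function setdiag!(b::AbstractMatrix, a::AbstractVector)
    n = min(size(b)...)
    length(a) == n || throw(DimensionMismatch("need $n diagonal entries"))
    for i in 1:n
        b[i, i] = a[i]
    end
    return b
end
```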

On Tuesday, May 20, 2014 10:46:28 PM UTC-4, gael@gmail.com wrote:
>
> While I cannot not agree with this ;), I'd like to state that:
> 1) High level functions might leverage clever algorithms faster than plain 
> loops (best example comming to mind: dot).
> 2) Vectorized code is always easier to understand, write, fix and maintain 
> because the intent is clear from the start. You equation is written just as 
> it was on paper and not burried within nested loops among many explicit 
> indices.
> Moreover, Julia will get better at devectorizing and at avoiding 
> temporaries.
>
> Therefore I would recommand using explicit loops only when *proved* to 
> provide a necessary speedup or a memory gain.
>
> In this case diagm is working just as intended and saving 20ns on matrix 
> construction just seem silly. I perfectly understand your point though but 
> explicit loops have downsides too.
>
> Profile first, optimize later... Comment now!  ;)
>
>

[julia-users] Re: JuMP @addConstraint Sum over list of tuples

2014-05-10 Thread Joey Huchette
Also note that by placing @defVar in a loop, you are overwriting the Julia 
variable x in every pass through the loop. New variables will be added to 
the JuMP model with every call to @defVar, but the only one accessible to 
the user will be the last one created (in this case, x[3,4]). I can't seem 
to replicate your error exactly, though.

On Saturday, May 10, 2014 12:08:33 PM UTC-4, Joey Huchette wrote:
>
> Hi Andrew,
> There are a couple of ways to make this work. The first is to construct x 
> such that it is indexed by the tuples (1,2) and (3,4):
>
> @defVar(m, x[con_values] >= 0)
> @addConstraint(m, t == sum{param[i,j]*x[(i,j)], (i,j)=con_values})
>
> The second is to unpack the tuples and construct x indexed on each 
> coordinate:
>
> idx = map(z->z[1], con_values)
> idy = map(z->z[2], con_values)
> @defVar(m, x[idx,idy] >= 0)
> @addConstraint(m, t == sum{param[i,j]*x[i,j], (i,j)=con_values})
>
> You have hit up against one of the limitations of JuMP variables as 
> currently implemented, in that the index set for x must be a cartesian 
> product of iterables (which does not hold in this particular case). We hope 
> to add this "feature" in the future, but it's a pretty nontrivial change.
>
> Btw in the future questions about JuMP (or the other JuliaOpt packages) 
> might be better suited on the JuliaOpt mailing list[1] or as an issue on 
> Github [2], rather than the julia-users list.
>
> [1] https://groups.google.com/forum/#!forum/julia-opt 
> [2] https://github.com/JuliaOpt/JuMP.jl/issues?state=open
>
> On Saturday, May 10, 2014 8:44:35 AM UTC-4, Andrew B. Martin wrote:
>>
>> Hello,
>>
>> I'm trying to do the following:
>>
>> @addConstraint(m, t == sum{param[i,j]*x[i,j], (i,j)=con_values})
>>
>> When I test it in the interpreter I get:
>>
>> *ERROR: no method 
>> addToExpression(GenericAffExpr{Float64,Variable},Float64,Array{Int64,1})*
>>
>> with con_values defined as [(1,2), (3,4)] and x defined as: 
>>
>> for (i,j)=con_values
>>
>> @defVar(m, x[i,j] >=0)
>>
>> end
>>
>> The docs show using the sum operator with single-value arrays. 
>> From the code illustrating what sum does it seems like it shouldn't be a 
>> stretch to do what I have above; I even use the list of tuples in the for 
>> loop to define the variables.
>>
>> How can I achieve a constraint definition as I have it above?
>>
>>
>>
>>

[julia-users] Re: JuMP @addConstraint Sum over list of tuples

2014-05-10 Thread Joey Huchette
Hi Andrew,
There are a couple of ways to make this work. The first is to construct x 
such that it is indexed by the tuples (1,2) and (3,4):

@defVar(m, x[con_values] >= 0)
@addConstraint(m, t == sum{param[i,j]*x[(i,j)], (i,j)=con_values})

The second is to unpack the tuples and construct x indexed on each 
coordinate:

idx = map(z->z[1], con_values)
idy = map(z->z[2], con_values)
@defVar(m, x[idx,idy] >= 0)
@addConstraint(m, t == sum{param[i,j]*x[i,j], (i,j)=con_values})

You have hit up against one of the limitations of JuMP variables as 
currently implemented, in that the index set for x must be a cartesian 
product of iterables (which does not hold in this particular case). We hope 
to add this "feature" in the future, but it's a pretty nontrivial change.

Btw in the future questions about JuMP (or the other JuliaOpt packages) 
might be better suited on the JuliaOpt mailing list[1] or as an issue on 
Github [2], rather than the julia-users list.

[1] https://groups.google.com/forum/#!forum/julia-opt 
[2] https://github.com/JuliaOpt/JuMP.jl/issues?state=open

On Saturday, May 10, 2014 8:44:35 AM UTC-4, Andrew B. Martin wrote:
>
> Hello,
>
> I'm trying to do the following:
>
> @addConstraint(m, t == sum{param[i,j]*x[i,j], (i,j)=con_values})
>
> When I test it in the interpreter I get:
>
> *ERROR: no method 
> addToExpression(GenericAffExpr{Float64,Variable},Float64,Array{Int64,1})*
>
> with con_values defined as [(1,2), (3,4)] and x defined as: 
>
> for (i,j)=con_values
>
> @defVar(m, x[i,j] >=0)
>
> end
>
> The docs show using the sum operator with single-value arrays. From 
> the code illustrating what sum does it seems like it shouldn't be a stretch 
> to do what I have above; I even use the list of tuples in the for loop to 
> define the variables.
>
> How can I achieve a constraint definition as I have it above?
>
>
>
>

Re: [julia-users] Complex Schur decomposition

2014-04-16 Thread Joey Huchette
MATLAB has a rsf2csf function that converts real Schur to complex Schur. 
Maybe an analogue in Julia is what I'm thinking about (and should alleviate 
the performance hit inside Lapack with the complex conversion, at least).
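
The convert-to-complex trick discussed downthread is easy to check directly. A sketch in current Julia, where `schurfact` became `schur` in the LinearAlgebra stdlib:

```julia
using LinearAlgebra

A = [0.0 -1.0;
     1.0  0.0]         # real rotation matrix, eigenvalues ±i
F = schur(complex(A))  # complex Schur form: F.T is genuinely upper triangular
istriu(F.T)            # holds; the real schur(A).T is only quasi-triangular here
```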

On Thursday, April 17, 2014 2:18:28 AM UTC-4, Joey Huchette wrote:
>
> A note in the docs seems like a reasonable solution, although I'm not sure 
> I agree that it's a completely transparent design decision. I've always 
> seen the real Schur form presented as an (important) computational trick to 
> avoid complex arithmetic, while the complex form is the mental model. I 
> have a feeling that many users may have never seen the real form before (as 
> a dumb little metric, neither Wikipedia nor Mathworld seem to mention it). 
> To bikeshed for a minute, calling an explicit conversion seems a little 
> bolted on for getting a complex Schur decomposition from a real matrix, 
> which seems like a common use case (at least for the MATLAB-style homework 
> questions like the one I was working on a few hours ago...). Just my two 
> cents.
>
> On Thursday, April 17, 2014 1:15:10 AM UTC-4, Andreas Noack Jensen wrote:
>>
>> I prefer the present solution where we use dispatch instead of keywords. 
>> I think it is transparent what is happening: real matrix -> real schur 
>> form, complex matrix -> complex schur form. However, we could add a line in 
>> the documentation explaining it. If you are okay with that idea, please 
>> open a pr. 
>>
>>
>> 2014-04-17 6:58 GMT+02:00 Joey Huchette :
>>
>>> Ahh simple enough, thanks. It might be nice to have a keyword argument 
>>> that just dispatches on this---it's not completely obvious that's the right 
>>> thing to do unless you dig around lapack.jl (or get a hint). I can PR.
>>>
>>>
>>> On Thursday, April 17, 2014 12:18:00 AM UTC-4, Andreas Noack Jensen 
>>> wrote:
>>>
>>>> The trick is to convert your matrix to complex before the calculation.
>>>>
>>>> schurfact(complex(A))[:T]
>>>>
>>>> gives the triangular part.
>>>>
>>>>
>>>> 2014-04-17 1:18 GMT+02:00 Joey Huchette :
>>>>
>>>> Is there an implementation in Base (or elsewhere) of a Schur 
>>>>> decomposition that returns a complex matrix T that is triangular? 
>>>>> For reference, MATLAB has an optional switch between the two forms. I 
>>>>> didn't 
>>>>> do enough digging to see if this option is exposed by Lapack, so maybe 
>>>>> the 
>>>>> conversion could be done at the Julia level? Apologies if I overlooked it 
>>>>> in the docs/source.
>>>>>
>>>>
>>>>
>>>>
>>>> -- 
>>>> Med venlig hilsen
>>>>
>>>> Andreas Noack Jensen
>>>>  
>>>
>>
>>
>> -- 
>> Med venlig hilsen
>>
>> Andreas Noack Jensen
>>  
>

Re: [julia-users] Complex Schur decomposition

2014-04-16 Thread Joey Huchette
A note in the docs seems like a reasonable solution, although I'm not sure 
I agree that it's a completely transparent design decision. I've always 
seen the real Schur form presented as an (important) computational trick to 
avoid complex arithmetic, while the complex form is the mental model. I 
have a feeling that many users may have never seen the real form before (as 
a dumb little metric, neither Wikipedia nor Mathworld seem to mention it). 
To bikeshed for a minute, calling an explicit conversion seems a little 
bolted on for getting a complex Schur decomposition from a real matrix, 
which seems like a common use case (at least for the MATLAB-style homework 
questions like the one I was working on a few hours ago...). Just my two 
cents.

On Thursday, April 17, 2014 1:15:10 AM UTC-4, Andreas Noack Jensen wrote:
>
> I prefer the present solution where we use dispatch instead of keywords. I 
> think it is transparent what is happening: real matrix -> real schur form, 
> complex matrix -> complex schur form. However, we could add a line in the 
> documentation explaining it. If you are okay with that idea, please open a 
> pr. 
>
>
> 2014-04-17 6:58 GMT+02:00 Joey Huchette:
>
>> Ahh simple enough, thanks. It might be nice to have a keyword argument 
>> that just dispatches on this---it's not completely obvious that's the right 
>> thing to do unless you dig around lapack.jl (or get a hint). I can PR.
>>
>>
>> On Thursday, April 17, 2014 12:18:00 AM UTC-4, Andreas Noack Jensen wrote:
>>
>>> The trick is to convert your matrix to complex before the calculation.
>>>
>>> schurfact(complex(A))[:T]
>>>
>>> gives the triangular part.
>>>
>>>
> >>> 2014-04-17 1:18 GMT+02:00 Joey Huchette:
> >>>
> >>> Is there an implementation in Base (or elsewhere) of a Schur 
> >>>> decomposition that returns a complex matrix T that is triangular? 
> >>>> For reference, MATLAB has an optional switch between the two forms. I 
> >>>> didn't 
> >>>> do enough digging to see if this option is exposed by LAPACK, so maybe the 
>>>> conversion could be done at the Julia level? Apologies if I overlooked it 
>>>> in the docs/source.
>>>>
>>>
>>>
>>>
>>> -- 
>>> Med venlig hilsen
>>>
>>> Andreas Noack Jensen
>>>  
>>
>
>
> -- 
> Med venlig hilsen
>
> Andreas Noack Jensen
>  


Re: [julia-users] Complex Schur decomposition

2014-04-16 Thread Joey Huchette
Ahh simple enough, thanks. It might be nice to have a keyword argument that 
just dispatches on this---it's not completely obvious that's the right 
thing to do unless you dig around lapack.jl (or get a hint). I can PR.

On Thursday, April 17, 2014 12:18:00 AM UTC-4, Andreas Noack Jensen wrote:
>
> The trick is to convert your matrix to complex before the calculation.
>
> schurfact(complex(A))[:T]
>
> gives the triangular part.
>
>
> 2014-04-17 1:18 GMT+02:00 Joey Huchette:
>
>> Is there an implementation in Base (or elsewhere) of a Schur 
> >> decomposition that returns a complex matrix T that is triangular? 
> >> For reference, MATLAB has an optional switch between the two forms. I didn't 
> >> do enough digging to see if this option is exposed by LAPACK, so maybe the 
>> conversion could be done at the Julia level? Apologies if I overlooked it 
>> in the docs/source.
>>
>
>
>
> -- 
> Med venlig hilsen
>
> Andreas Noack Jensen
>  


Re: [julia-users] Performance expectations and benchmarks

2014-04-16 Thread Joey Huchette
Have you run the benchmarks on newer versions of Julia? Would be 
interesting to see the delta.

On Wednesday, April 16, 2014 11:15:17 PM UTC-4, Iain Dunning wrote:
>
> Some more involved benchmarks here:
> http://arxiv.org/pdf/1312.1431v1.pdf
> achieve similar performance to the front-page benchmarks - see section 4.
>
> On Wednesday, April 16, 2014 10:12:51 PM UTC-4, Jameson wrote:
>>
>> There are very few types in Julia that are not custom. 
>>
>> Custom Types include: Strings, Numbers, Dictionary, BitArrays (arrays 
>> of bits packed into words), SharedArrays (using mmap), Ranges (eg. 
>> iteration) 
>>
>> Builtin Types include: DataType, UnionType, AbstractType, Tuple, Arrays 
>>
>>
>>
>> On Wed, Apr 16, 2014 at 9:54 PM, Gilberto Noronha  
>> wrote: 
>> > Well, I would like to see something with custom types and more 
>> complexity. 
>> > What works on minimal examples does not necessarily scale (I've seen 
>> some 
>> > very fast but very horrible to maintain Fortran code, so I should 
>> know). 
>> > C++ deals very well with complexity (if you know what you are doing). 
>> Java 
> does it too. Fortran does not, so I was curious about how Julia 
>> performs 
>> > with larger/more complex programs. 
>> > 
>> > Gilberto 
>> > 
>> > 
>> > On Wednesday, April 16, 2014 9:41:42 PM UTC-4, Kevin Squire wrote: 
>> >> 
>> >> While getting performance from Julia isn't always straightforward (see 
>> >> http://julia.readthedocs.org/en/latest/manual/performance-tips/), I'm 
>> >> curious why you find the benchmarks misleading?  Is it because they're 
>> >> focused on minimal examples, or maybe because they don't include some 
>> of 
>> >> your use cases?  Often when someone has brought this up before, they 
>> had 
>> >> particular performance problems, some of which led or pointed to the 
>> >> performance tips above, others which were turned into performance 
>> tests (see 
>> >> http://speed.julialang.org/). 
>> >> 
>> >> Anyway, hope the above helps. If you have suggestions for more 
>> realistic 
>> >> benchmarks, please make a proposal! 
>> >> 
>> >> Cheers, Kevin 
>> >> 
>> >> On Wednesday, April 16, 2014, Gilberto Noronha  
>> >> wrote: 
>> >>> 
>> >>> Hi all, 
>> >>> 
>> >>> I was looking at some benchmark results (mainly the ones on 
>> >>> julialang.org) and I could not resist the thought that they are a 
>> bit 
>> >>> misleading. I say this as a Julia fan (I wish I did not think this 
>> way!) 
>> >>> because realistically I don't think one can expect a real world Julia 
>> >>> application to be even close (performance-wise)  to a highly 
>> optimized C, 
>> >>> C++ or Fortran program (the website suggests that the performance 
>> ratio is 
>> >>> almost one to one). 
>> >>> 
>> >>> In my (highly subjective) experience it is possible to hope for a 
>> Julia 
>> >>> program to be competitive with Java and Go (which is already a very 
>> good 
>> >>> thing, when Python and Matlab can be hundreds of times slower!). 
>> >>> But I am curious about other's experiences and whether I could find 
>> "more 
>> >>> realistic" benchmarks. In particular, I am curious about benchmarks 
>> >>> involving custom objects/classes. My gut feeling is that Julia is not 
>> as 
>> >>> good with those, but I believe it still beats Matlab very easily 
>> (because, 
>> >>> among other things, Matlab classes seem to be horribly slow). 
>> >>> 
>> >>> Any thoughts? 
>> >>> 
>> >>> Gilberto 
>>
>

[julia-users] Complex Schur decomposition

2014-04-16 Thread Joey Huchette
Is there an implementation in Base (or elsewhere) of a Schur decomposition 
that returns a complex matrix T that is triangular? For reference, 
MATLAB has an optional switch between the two forms. I didn't do enough 
digging to see if this option is exposed by LAPACK, so maybe the conversion 
could be done at the Julia level? Apologies if I overlooked it in the 
docs/source.


Re: [julia-users] Re: How to use GLPK.exact ?

2014-04-10 Thread Joey Huchette
Either approach should work in principle (pycddlib vs. cddlib), but I 
suspect that the Python wrapper is higher-level, and will be easier to use 
from Julia. For reference, here's a snippet of how you might calculate the 
extreme points as rationals, calling pycddlib from Julia (adapted from code 
from Miles): https://gist.github.com/joehuchette/10396014 

A computational geometry package in Julia would be a godsend for me (right 
now I'm working in Perl which is...not ideal), but I agree that it would be 
quite a bit of work. Certainly on my radar, though.
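(To make the "extreme points as rationals" idea concrete without pulling in cddlib: a vertex of a polyhedron is where a set of constraints hold with equality, and solving that system over the rationals avoids floating-point rounding entirely. This sketch uses only Python's stdlib `Fraction` and a hypothetical helper name, not pycddlib's actual API.)

```python
# Illustration of exact vertex computation over the rationals.
# Stdlib-only sketch; `vertex_2d` is a made-up helper, NOT pycddlib's API.
from fractions import Fraction

def vertex_2d(a1, b1, c1, a2, b2, c2):
    """Exact intersection of a1*x + b1*y = c1 and a2*x + b2*y = c2,
    via Cramer's rule carried out in rational arithmetic."""
    det = Fraction(a1) * b2 - Fraction(a2) * b1
    if det == 0:
        raise ValueError("constraints are parallel")
    x = (Fraction(c1) * b2 - Fraction(c2) * b1) / det
    y = (Fraction(a1) * c2 - Fraction(a2) * c1) / det
    return x, y

# Vertex where x + 3y = 1 and 3x + y = 1 are both tight:
x, y = vertex_2d(1, 3, 1, 3, 1, 1)
assert (x, y) == (Fraction(1, 4), Fraction(1, 4))  # exact, no rounding
```

A tool like cddlib does the same kind of arithmetic, but for vertex enumeration of general H-representations rather than a single 2x2 system.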


On Thursday, April 10, 2014 5:49:16 AM UTC-4, Stéphane Laurent wrote:
>
> Again, thank you for all these answers. Sorry Carlo, I missed the double 
> slash in your previous answer. 
>
> It would be a good opportunity for me to call Python in order to train my 
> skills in Python in addition to Julia. But why do you suggest me to call 
> pycddlib with PyCall rather than calling cddlib with ccall ? 
>


[julia-users] Re: Seeing Segmentation Fault

2014-02-25 Thread Joey Huchette
Looks superficially similar 
to https://github.com/JuliaLang/julia/issues/5106

On Tuesday, February 25, 2014 6:02:55 PM UTC-5, Michael Schnall-Levin wrote:
>
> Hello,
>
> I'm seeing a segmentation fault while running julia code.  I got it to 
> reproduce multiple times on a small script that I'm running.
>
> The only error message I got was below.
>
> I'm running julia 0.3.0-prerelease+1564 on Ubuntu 12.04.
>
> Any help is very appreciated.  
>
> Thanks, Mike
>
>
> -- Error Message 
>
> fatal: error thrown and no exception handler available.
>
> Base.MethodError(f=Base.Enumerate{I}, 
> args=((
>
> :
>
> :
>
> :
> .. (this continues for more lines) ..
>
> : fault (core dumped)
>


Re: [julia-users] Can't manage packages

2014-01-30 Thread Joey Huchette
FYI we managed to get around this today by just deleting Stats and 
METADATA, so maybe try that before wiping all of ~/.julia

On Wednesday, January 29, 2014 9:30:45 PM UTC-5, John Myles White wrote:
>
> Ok. You'll unfortunately have to either (1) delete your ~/.julia folder or 
> (2) manually rename the Stats package to StatsBase and then edit its 
> .git/config file.
>
>  -- John
>
> On Jan 29, 2014, at 6:21 PM, Carlos Lesmes wrote:
>
>
> I got
>
> Pkg.rm("Stats")
>
>  
>
> ERROR: failed process: Process(`git 
> --git-dir=/Users/carloslesmes/.julia/.cache/Stats merge-base 
> 0efba512a2bf8faa21e61c9568222ae1ae96acbb 
> 5113ce6044fc554b350ea16f92502f8d6e077a62`, ProcessExited(1)) [1] 
>
>  in pipeline_error at process.jl:476 
>
>  in readbytes at process.jl:430 
>
>  in readall at process.jl:437 
>
>  in readchomp at git.jl:26 
>
>  in installed_version at pkg/read.jl:70 
>
>  in installed at pkg/read.jl:121 
>
>  in resolve at pkg/entry.jl:316 
>
>  in edit at pkg/entry.jl:24 
>
>  in rm at pkg/entry.jl:51 
>
>  in anonymous at pkg/dir.jl:25 
>
>  in cd at file.jl:22 
>
>  in cd at pkg/dir.jl:25 
>
>  in rm at pkg.jl:18
>
>
>
> On Tuesday, January 28, 2014 9:56:16 PM UTC-5, John Myles White wrote:
>>
>> Try doing Pkg.rm(“Stats”).
>>
>>  — John
>>
>> On Jan 28, 2014, at 6:47 PM, Carlos Lesmes  wrote:
>>
>> Hi,
>> I'm on mac 10.7 Julia 0.2.0, today I updated but found this:
>> julia> Pkg.update()
>> INFO: Updating METADATA...
>> INFO: Updating cache of Stats...
>> INFO: Updating cache of StatsBase...
>> INFO: Updating cache of Distance...
>> INFO: Updating cache of JSON...
>> INFO: Updating cache of PyPlot...
>> INFO: Updating cache of NumericExtensions...
>> ERROR: failed process: Process(`git 
>> --git-dir=/Users/carloslesmes/.julia/.cache/Stats merge-base 
>> 0efba512a2bf8faa21e61c9568222ae1ae96acbb 
>> 5113ce6044fc554b350ea16f92502f8d6e077a62`, ProcessExited(1)) [1]
>>  in pipeline_error at process.jl:476
>>  in readbytes at process.jl:430
>>  in readall at process.jl:437
>>  in readchomp at git.jl:26
>>  in installed_version at pkg/read.jl:70
>>  in installed at pkg/read.jl:121
>>  in update at pkg/entry.jl:231
>>  in anonymous at pkg/dir.jl:25
>>  in cd at file.jl:22
>>  in cd at pkg/dir.jl:25
>>  in update at pkg.jl:40
>> anybody knows what's wrong? Please help.
>>
>>
>>
>