On Tuesday, September 1, 2015 at 1:27:25 PM UTC-4, Fabrizio Lacalandra 
wrote:
>
> Great job! May I suggest we start thinking about:
>
> 1) @removeconstraint
>

Should be pretty easy to add a function that does this in just a few lines 
of code.
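
For illustration, a minimal sketch of what such a helper might look like 
(removeConstraint! is a hypothetical name, and this assumes the model keeps 
its linear constraints in an internal vector):

using JuMP

# Hypothetical sketch only, not current JuMP API: drop the idx-th linear
# constraint from the model's internal list. Note this shifts the indices
# of all later constraints, so any saved references would go stale.
function removeConstraint!(m::Model, idx::Int)
    deleteat!(m.linconstr, idx)
end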
 

> 2) @defvar with an indication of branching priority for integer vars, 
> accepted by CPLEX/Gurobi at least
>

It's not quite at the JuMP level, but you can set the branching priority 
[for 
CPLEX](https://github.com/JuliaOpt/CPLEX.jl/blob/25ebbf1c8444c961045b9b15b1a225432eadb811/src/cpx_solve.jl#L15-L37).
 

> 3) Some kind of special Constraint Programming-like construct, such as 
> the classic allDiff and more advanced things?
>

This is definitely on the agenda! It would probably fit best as a separate 
package, but it seems like a natural choice for the next "big" project.
 

> 4) Start a discussion on how modularity in the problem construction can be 
> enhanced (not sure in which direction)
>

We'd definitely appreciate any user feedback on how this should work.
 

>
> BTW, do we have a wish list for JuMP somewhere?
>

No official list, but feel free to open issues for feature requests on the 
GitHub page.
 

>
> Thanks
> Fabrizio 
>
> On Tuesday, September 1, 2015 at 6:41:21 AM UTC+2, Miles Lubin wrote:
>>
>> The JuMP team is happy to announce the release of JuMP 0.10.
>>
>> This is a major release with the greatest amount of new functionality 
>> since the addition of nonlinear modeling last year. This will likely be the 
>> last major release of JuMP to support Julia 0.3. Thanks to the heroic work 
>> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for 
>> *semidefinite 
>> programming*.
>>
>> You can now write, for example:
>>
>> @defVar(m, x[1:5])
>> @addConstraint(m, A*x .== 0)
>>
>> where A is a Julia matrix (dense or sparse). Note that we require dot 
>> comparison operators .== (and similarly .<= and .>=) for vectorized 
>> constraints. The vectorized syntax extends to quadratic but not general 
>> nonlinear expressions.
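>>
>> As a self-contained sketch (the data, bounds, and objective here are made 
>> up for illustration, and a default LP solver is assumed to be installed):
>>
>> using JuMP
>> m = Model()
>> @defVar(m, 0 <= x[1:5] <= 1)
>> A = sprand(3, 5, 0.5)          # sparse works as well as dense
>> @addConstraint(m, A*x .== 0)   # note the dot comparison
>> @setObjective(m, Max, sum{x[i], i=1:5})
>> solve(m)
>> getValue(x)                    # a Vector{Float64}; see below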
>>
>> An important new concept to keep in mind is that this vectorized syntax 
>> only applies to sets of variables which are one-based arrays. If you 
>> declare variables indexed by more complicated sets, e.g.,
>>
>> @defVar(m, y[3:5])
>> s = [:cat, :dog, :pizza]
>> @defVar(m, z[s])
>>
>> then dot(y,z) and rand(3,3)*z are undefined. A result of this new 
>> concept of one-based arrays is that x above now has the type 
>> Vector{JuMP.Variable}. In this case, getValue() now returns a 
>> Vector{Float64} instead of an opaque JuMP object. We hope users find 
>> this new distinction between one-indexed array variables and all other 
>> symbolically indexed variables useful and intuitive (if not, let us know).
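>>
>> Concretely (a short sketch; solver details omitted):
>>
>> @defVar(m, x[1:5])   # one-based: typeof(x) is Vector{JuMP.Variable}
>> dot(rand(5), x)      # defined, since x is an ordinary Julia vector
>> @defVar(m, y[3:5])   # not one-based: dot(rand(3), y) is undefined
>> # after solve(m), getValue(x) is a Vector{Float64}, while getValue(y)
>> # returns the opaque JuMP object mentioned above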
>>
>> For semidefinite modeling, you can declare variables as SDP matrices and 
>> add LMI (linear matrix inequality) constraints as illustrated in the 
>> examples for minimal ellipse 
>> <https://github.com/JuliaOpt/JuMP.jl/blob/6e7c86acfe09c4970741d957e381446bfd7630ca/examples/minellipse.jl>
>>  and 
>> max cut 
>> <https://github.com/JuliaOpt/JuMP.jl/blob/6e7c86acfe09c4970741d957e381446bfd7630ca/examples/maxcut_sdp.jl>,
>>  
>> among others.
>>
>> We also have a *new syntax for Euclidean norms:*
>>
>> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
>> # or
>> @addConstraint(m, norm(c.*x+b) <= 10)
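>>
>> For instance, a complete toy model using the second form (the data is made 
>> up, and a conic-capable solver such as ECOS is assumed to be installed):
>>
>> using JuMP
>> N = 5
>> b = rand(N); c = rand(N)
>> m = Model()
>> @defVar(m, x[1:N])
>> @addConstraint(m, norm(c.*x + b) <= 10)   # second-order cone constraint
>> @setObjective(m, Max, sum{x[i], i=1:N})
>> solve(m)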
>>
>> You may be wondering how JuMP compares with Convex.jl given these new 
>> additions. Not much has changed philosophically; JuMP directly translates 
>> SDP constraints and Euclidean norms into the sparse matrix formats as 
>> required by conic solvers. Unlike Convex.jl, *JuMP accepts only 
>> standard-form SDP and second-order conic constraints and will not perform 
>> any automatic transformations* such as modeling nuclear norms, minimum 
>> eigenvalue, geometric mean, rational norms, etc. We would recommend using 
>> Convex.jl for easy modeling of such functions. Our focus, for now, is on 
>> the large-scale performance and stability of the huge amount of new syntax 
>> introduced in this release.
>>
>> Also notable in this release:
>> - JuMP models now store a dictionary of attached variables, so you can 
>> look up a variable in a model by name using the new getVar() method (see 
>> the short sketch after this list).
>> - On Julia 0.4 only, you can now use a filter in variable declarations, 
>> e.g.,
>> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
>> will only create variables for the indices which satisfy the filter 
>> condition. (These are not one-based arrays as introduced above.)
>> - Dual multipliers are available for nonlinear problems from the solvers 
>> which provide them
>> - There is improved documentation for querying derivatives from a 
>> nonlinear JuMP model 
>> <http://jump.readthedocs.org/en/latest/nlp.html#querying-derivatives-from-a-jump-model>
>> - *We now try to print warnings for two common performance traps*: 
>> calling getValue() in a tight loop and using operator overloading to 
>> construct large JuMP expressions. Please let us know if these are useful or 
>> annoying or both so that we can tune the warning thresholds.
>> - Thanks to Tony Kelman and Jack Dunn, you can now call a large number of 
>> external solvers including Bonmin and Couenne through either the .osil or 
>> .nl exchange formats.
>> - For those on Julia 0.4, module precompilation considerably speeds up 
>> loading JuMP.
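>>
>> A quick sketch of the getVar() lookup and filtered declarations mentioned 
>> above (the exact signature of getVar() shown here is a guess):
>>
>> m = Model()
>> @defVar(m, x[i=1:5, j=1:5; i + j >= 3])  # filtered declaration (Julia 0.4)
>> x_again = getVar(m, :x)                  # look up the collection by name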
>>
>> The delay since the last release of JuMP is mostly due to the time spent 
>> testing and refining the new syntax, but inevitably some bugs have slipped 
>> through, so please let us know of any incorrect or confusing behavior.
>>
>> Also newsworthy is our new paper <http://arxiv.org/abs/1508.01982> 
>> describing the methods used in JuMP with benchmark comparisons to existing 
>> open-source and commercial optimization modeling software.
>>
>> Miles, Iain, and Joey
>>
>>
