I have dealt with the pain of annoying solver interfaces in the past, and 
seeing all of this come together so cleanly and effortlessly for the user 
is a huge differentiator for Julia.

Amazing work Madeleine, and the whole JuliaOpt team. I continue to cheer 
from the sidelines.

Some blog posts on the whole solver-interoperability story would be great 
to have.

-viral

On Thursday, February 5, 2015 at 8:42:33 AM UTC+5:30, Miles Lubin wrote:
>
> I'm personally very pleased to see this. The JuMP team has worked closely 
> with the Convex.jl team to make sure that we share a common infrastructure 
> (through MathProgBase) to talk to solvers, and I don't think it's an 
> exaggeration to say that this has resulted in an unprecedented level of 
> solver interoperability. At this point I'm hard pressed to think of another 
> platform besides Julia which lets you easily switch between AMPL/GAMS-style 
> modeling (via JuMP) and DCP-style modeling (via Convex.jl) as you might 
> experiment with different formulations of a problem.
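>
> As a rough sketch of what I mean (untested, and modulo changes in macro 
> names across JuMP versions), the same toy LP can be written in either 
> style:
>
> # JuMP: AMPL/GAMS-style scalar modeling
> using JuMP
> mdl = Model()
> @defVar(mdl, x >= 0)
> @defVar(mdl, y >= 0)
> @setObjective(mdl, Min, 2x + 3y)
> @addConstraint(mdl, x + y >= 1)
> solve(mdl)
>
> # Convex.jl: DCP-style functional modeling
> using Convex
> x = Variable(); y = Variable()
> problem = minimize(2x + 3y, [x >= 0, y >= 0, x + y >= 1])
> solve!(problem)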
>
> On Wednesday, February 4, 2015 at 8:57:51 PM UTC-5, Elliot Saba wrote:
>>
>> This is so, so cool, Madeleine. Thank you for sharing. I've been a huge 
>> fan of DCP ever since I took a convex optimization course here at the UW 
>> (which of course featured CVX and Boyd's book), and seeing this in Julia 
>> makes me smile.
>> -E
>>
>> On Wed, Feb 4, 2015 at 5:53 PM, Madeleine Udell <madelei...@gmail.com> 
>> wrote:
>>
>>> Convex.jl <https://github.com/JuliaOpt/Convex.jl> is a Julia library 
>>> for mathematical programming that makes it easy to formulate and fast to 
>>> solve nonlinear convex optimization problems. Convex.jl 
>>> <https://github.com/JuliaOpt/Convex.jl> is a member of the JuliaOpt 
>>> <https://github.com/JuliaOpt> umbrella group and can use (nearly) any 
>>> solver that complies with the MathProgBase interface, including Mosek 
>>> <https://github.com/JuliaOpt/Mosek.jl>, Gurobi 
>>> <https://github.com/JuliaOpt/Gurobi.jl>, ECOS 
>>> <https://github.com/JuliaOpt/ECOS.jl>, SCS 
>>> <https://github.com/JuliaOpt/SCS.jl>, and GLPK 
>>> <https://github.com/JuliaOpt/GLPK.jl>.
>>>
>>> Here's a quick example of code that solves a non-negative least-squares 
>>> problem.
>>>
>>> using Convex
>>>
>>> # Generate random problem data
>>> m = 4;  n = 5
>>> A = randn(m, n); b = randn(m, 1)
>>>
>>> # Create a (column vector) variable of size n x 1.
>>> x = Variable(n)
>>>
>>> # The problem is to minimize ||Ax - b||^2 subject to x >= 0
>>> problem = minimize(sum_squares(A * x - b), [x >= 0])
>>>
>>> solve!(problem)
>>>
>>> We could instead solve a robust approximation problem by replacing 
>>> sum_squares(A * x - b) with norm(A * x - b, 1) or sum(huber(A * x - b)); 
>>> it's that easy.
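>>>
>>> And since all of these solvers speak MathProgBase, switching between 
>>> them is a one-line change. Something like this should work for the 
>>> problem above (assuming the solver package is installed; an SCSSolver() 
>>> or GurobiSolver() would work the same way):
>>>
>>> using ECOS
>>> solve!(problem, ECOSSolver())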
>>>
>>> Convex.jl <https://github.com/JuliaOpt/Convex.jl> is different from JuMP 
>>> <https://github.com/JuliaOpt/JuMP.jl> in that it allows (and 
>>> prioritizes) linear algebraic and functional constructions in objectives 
>>> and constraints (like max(x,y) < A*z). Under the hood, it converts 
>>> problems to a standard conic form, which requires (and certifies) that the 
>>> problem is convex, and guarantees global optimality of the resulting 
>>> solution.
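>>>
>>> For example, the DCP check catches non-convex formulations before any 
>>> solver is called (a small sketch; the exact warning text may differ):
>>>
>>> x = Variable()
>>> # Minimizing a convex atom is DCP-compliant: the problem is certified
>>> # convex and solved to global optimality.
>>> p = minimize(square(x), [x >= 1])
>>> solve!(p)
>>>
>>> # Maximizing a convex function is not DCP-compliant; Convex.jl warns
>>> # rather than silently returning a local solution.
>>> q = maximize(square(x), [x <= 1])
>>> solve!(q)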
>>>
>>
>>
