Re: [julia-users] Announcing Convex.jl - Convex Optimization in Julia

2015-02-04 Thread Yee Sian Ng

> Perhaps some blog posts on the whole solver interoperability would be
> great to have.
Miles did a notebook for a class some weeks back. If anyone else has written 
something about it (that they would like to share), feel free to contribute 
to our (growing) repository of notebooks! (:


Re: [julia-users] Announcing Convex.jl - Convex Optimization in Julia

2015-02-04 Thread Viral Shah
I was pleasantly surprised to find this:

http://faculty.bscb.cornell.edu/~bien/convexjulia.html

-viral


Re: [julia-users] Announcing Convex.jl - Convex Optimization in Julia

2015-02-04 Thread Viral Shah
I have dealt with the pain of annoying solver interfaces in the past and 
just seeing all this come together so cleanly and effortlessly for the user 
is a huge differentiator for Julia.

Amazing work Madeleine, and the whole JuliaOpt team. I continue to cheer 
from the sidelines.

Perhaps some blog posts on the whole solver interoperability would be great 
to have.

-viral


Re: [julia-users] Announcing Convex.jl - Convex Optimization in Julia

2015-02-04 Thread Miles Lubin
I'm personally very pleased to see this. The JuMP team has worked closely 
with the Convex.jl team to make sure that we share a common infrastructure 
(through MathProgBase) to talk to solvers, and I don't think it's an 
exaggeration to say that this has resulted in an unprecedented level of 
solver interoperability. At this point I'm hard pressed to think of another 
platform besides Julia which lets you easily switch between AMPL/GAMS-style 
modeling (via JuMP) and DCP-style modeling (via Convex.jl) as you might 
experiment with different formulations of a problem.
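To make the switching described above concrete, here is a minimal sketch of 
the same toy problem written both ways. This is an illustration, not code 
from either package's documentation: the JuMP half assumes the macro syntax 
JuMP used at the time (@defVar, @setObjective, sum{}), the Convex.jl half 
assumes the Variable/minimize/solve! API from the announcement below, and 
both assume a MathProgBase-compliant solver such as ECOS is installed.

```julia
# Same toy problem -- minimize sum(x) subject to x .>= 1 -- expressed in
# JuMP's AMPL/GAMS-style macros and in Convex.jl's DCP-style expressions.

using JuMP
jm = Model()
@defVar(jm, x[1:5] >= 1)               # variables with attached bounds
@setObjective(jm, Min, sum{x[i], i=1:5})
solve(jm)                              # dispatches to the default LP solver

using Convex
y = Variable(5)                        # a 5-vector variable
problem = minimize(sum(y), [y >= 1])   # constraints given as a list
solve!(problem)                        # same MathProgBase solvers underneath
```

Both formulations reach the same solvers through MathProgBase, which is 
what makes moving a model between the two styles cheap.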


Re: [julia-users] Announcing Convex.jl - Convex Optimization in Julia

2015-02-04 Thread Elliot Saba
This is so so cool, Madeleine.  Thank you for sharing.  I'm a huge fan of
DCP, ever since I took a convex optimization course here at the UW (which
of course featured cvx and Boyd's book) and seeing this in Julia makes me
smile.
-E



[julia-users] Announcing Convex.jl - Convex Optimization in Julia

2015-02-04 Thread Madeleine Udell


Convex.jl is a Julia library for mathematical programming that makes it 
easy to formulate and fast to solve nonlinear convex optimization problems. 
Convex.jl is a member of the JuliaOpt umbrella group and can use (nearly) 
any solver that complies with the MathProgBase interface, including Mosek, 
Gurobi, ECOS, SCS, and GLPK.

Here's a quick example of code that solves a non-negative least-squares 
problem.

using Convex

# Generate random problem data
m = 4;  n = 5
A = randn(m, n); b = randn(m, 1)

# Create a (column vector) variable of size n x 1.
x = Variable(n)

# The problem is to minimize ||Ax + b||^2 subject to x >= 0
problem = minimize(sum_squares(A * x + b), [x >= 0])

solve!(problem)

We could instead solve a robust approximation problem by replacing 
sum_squares(A * x + b) by sum(norm(A * x + b, 1)) or sum(huber(A * x + b)); 
it's that easy.
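Spelled out, the robust variants differ from the least-squares version only 
in the objective. This sketch just reuses the atoms named above 
(sum_squares, norm, huber) under the same assumptions as the example, with 
problem names of my own choosing:

```julia
using Convex

# Same random data as in the example above
m = 4; n = 5
A = randn(m, n); b = randn(m, 1)
x = Variable(n)

# Least-squares objective: residuals penalized quadratically
ls = minimize(sum_squares(A * x + b), [x >= 0])

# l1 objective: large residuals penalized only linearly (robust to outliers)
l1 = minimize(sum(norm(A * x + b, 1)), [x >= 0])

# Huber objective: quadratic near zero, linear in the tails
hb = minimize(sum(huber(A * x + b)), [x >= 0])

solve!(l1)  # each variant solves through the same interface
```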

Convex.jl is different from JuMP in that it allows (and prioritizes) linear 
algebraic and functional constructions in objectives and constraints (like 
max(x,y) < A*z). Under the hood, it converts problems to a standard conic 
form, which requires (and certifies) that the problem is convex, and 
guarantees global optimality of the resulting solution.
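As a concrete illustration of the DCP checking described above, the 
following sketch builds a problem with a max(x,y) < A*z style of 
constraint. The sizes and objective are hypothetical choices of mine, made 
under the same API assumptions as the earlier example:

```julia
using Convex

# Hypothetical sizes chosen for illustration
A = randn(3, 2)
x = Variable(3); y = Variable(3); z = Variable(2)

# max(x, y) is the elementwise maximum of two affine expressions, hence
# convex; bounding a convex expression above by the affine A*z is a valid
# DCP construction, so the problem builds and certifies as convex.
problem = minimize(sum_squares(z), [max(x, y) < A * z])

solve!(problem)  # converted to standard conic form, then handed to a solver
```

A non-convex construction, such as an objective that maximizes a convex 
atom, would instead fail the DCP check rather than being silently 
mis-solved.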