Github user debasish83 commented on the pull request:

    https://github.com/apache/spark/pull/458#issuecomment-42220945
  
    It depends on how you handle the L1 penalty with lbfgs...
    
    OWL-QN for L1 is definitely a solution...
    
    You can also replace the L1 penalty with a smooth soft-max approximation,
    but then you have to be careful with the schedule of the soft-max
    smoothness parameter....
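    The soft-max replacement mentioned above can be sketched as follows (a
    hypothetical Python/NumPy illustration, not code from this PR; the
    function name `soft_abs` and the test point are made up). The idea is
    that |x| ≈ (1/α)·log(exp(αx) + exp(−αx)), which is smooth for any
    finite α and approaches |x| as α grows; the "schedule" is how you
    increase α over the course of optimization, since at x = 0 the
    approximation is off by log(2)/α:

    ```python
    import numpy as np

    def soft_abs(x, alpha):
        # Smooth "soft-max" approximation of |x|:
        #   |x| ~= (1/alpha) * log(exp(alpha*x) + exp(-alpha*x))
        # np.logaddexp keeps the log-sum-exp numerically stable for large alpha*x.
        return np.logaddexp(alpha * x, -alpha * x) / alpha

    # The approximation tightens as alpha grows (the smoothing schedule):
    for alpha in (1.0, 10.0, 100.0):
        print(alpha, soft_abs(0.5, alpha))  # tends toward |0.5| = 0.5
    ```

    A fixed large α makes the objective nearly nonsmooth again (defeating
    the point of using lbfgs), which is why the schedule matters.
    
    
    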
    
    I think just picking OWL-QN for L1 (as implemented in breeze) and
    comparing it against ADMM will be good....
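    For reference, the ADMM side of that comparison can be sketched with the
    standard splitting for the lasso problem, min ½‖Ax − b‖² + λ‖x‖₁ (a
    minimal NumPy illustration under assumed problem sizes, not the Spark or
    breeze implementation; `lasso_admm` and all parameters here are
    hypothetical):

    ```python
    import numpy as np

    def lasso_admm(A, b, lam, rho=1.0, iters=200):
        # Minimal ADMM sketch for: min 0.5*||Ax - b||^2 + lam*||x||_1
        n = A.shape[1]
        x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
        Atb = A.T @ b
        # Factor (A^T A + rho*I) once; reused every x-update.
        L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
        for _ in range(iters):
            # x-update: ridge-like linear solve via the cached Cholesky factor
            x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
            # z-update: soft-thresholding, the proximal operator of the L1 term
            v = x + u
            z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
            # dual (scaled multiplier) update
            u = u + x - z
        return z

    # Noiseless recovery of a sparse vector (sizes chosen for illustration):
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    x_true = np.zeros(10)
    x_true[:3] = [2.0, -3.0, 1.5]
    b = A @ x_true
    x_hat = lasso_admm(A, b, lam=0.1)
    ```

    OWL-QN attacks the same nonsmooth objective directly via projected
    quasi-Newton steps, so the two make a natural benchmark pair.
    
    
    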
    
    
    
    On Sun, May 4, 2014 at 10:31 PM, DB Tsai <notificati...@github.com> wrote:
    
    > lbfgs is not good for the L1 problem. I'm working on and preparing to
    > benchmark the bfgs variant OWL-QN for L1, which is ideal to compare
    > with ADMM.
    >
    > —
    > Reply to this email directly or view it on GitHub:
    > https://github.com/apache/spark/pull/458#issuecomment-42160096
    >

