[Ipopt] IPOpt for l1 optimization?

Jonathan Hogg jonathan.hogg at stfc.ac.uk
Thu Apr 12 03:44:44 EDT 2012


It may also be worth looking at some of the algorithms used in 
compressed sensing. I'm told these algorithms don't have strong 
complexity results (many are no better than general methods in the 
worst case), but they do exhibit good practical performance on many 
L1 optimization problems.
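
For what it's worth, the smooth reformulation Peter mentions below is the
usual splitting trick: introduce one auxiliary variable t_i per absolute
value and minimize their sum,

    min_{x,t}  sum_{i=1}^N t_i
    s.t.        <x, c_i> - t_i <= 0,   i = 1..N
               -<x, c_i> - t_i <= 0,   i = 1..N
                A x <= 0

The constraints force t_i >= |<x, c_i>|, and at the optimum t_i = |<x, c_i>|,
so the two problems agree in x. Everything here is linear, hence smooth, so
Ipopt (or in this particular case any LP solver) can handle it. Note I have
written A x <= 0 rather than the strict inequality in the original post,
since solvers do not accept strict inequalities directly.

Just to illustrate the layout of the reformulated problem, here is a rough
sketch on made-up data using SciPy's LP solver rather than Ipopt; the names
n, N, m, C, A are placeholders for whatever the real problem uses:

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, N, m = 3, 5, 2                  # dimension of x, number of c_i, rows of A
C = rng.standard_normal((N, n))    # rows are the vectors c_i
A = rng.standard_normal((m, n))

# Decision vector z = [x, t].  Minimize sum(t) subject to
#   C x - t <= 0,  -C x - t <= 0,  A x <= 0.
I = np.eye(N)
A_ub = np.block([[ C, -I],
                 [-C, -I],
                 [ A, np.zeros((m, N))]])
b_ub = np.zeros(2 * N + m)
obj  = np.concatenate([np.zeros(n), np.ones(N)])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * (n + N))  # x, t free; t >= 0 follows from the constraints
print(res.x[:n], res.fun)

(With A x <= 0 the point x = 0 is feasible with objective zero, so on this
toy data the solver just returns zeros; in a real application there is
presumably some normalisation or extra constraint that rules that out.)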

Jonathan.

On 11/04/12 18:51, Paul van Hoven wrote:
> Thank you for the answer, Peter. Can you recommend some references on
> this transformation?
>
> On 11 April 2012 18:33, Peter Carbonetto <pcarbo at uchicago.edu> wrote:
>> Is there an absolute value in that objective function you are minimizing? If
>> so, then the answer is no, because the objective is non-smooth (its derivative
>> is undefined wherever an inner product <x, c_i> is zero). But you can convert
>> this to an equivalent smooth optimization problem with additional inequality
>> constraints. There is quite a bit of literature on this topic.
>>
>> Peter Carbonetto, Ph.D.
>> Postdoctoral Fellow
>> Dept. of Human Genetics
>> University of Chicago
>>
>>
>> On Wed, 11 Apr 2012, Paul van Hoven wrote:
>>
>>> I've got the following problem:
>>>
>>> min_x  sum_{i=1}^N |<x, c_i>|
>>> s.t.   A x < 0
>>>
>>> <x, c_i> denotes the standard scalar product between x and c_i.
>>>
>>> Is this a problem that can be solved appropriately with IPOpt?
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> http://list.coin-or.org/mailman/listinfo/ipopt
