First of all, you'd better get rid of the indexed x and use ordinary named variables, e.g.

massOfGrass, massOfWater, massOfProtein, massOfCarbons = oovars(4)

constraints = [massOfGrass>0, massOfWater>0, (etc)]

If your objective receives a point that is outside its domain, you can return numpy.nan for it.
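A minimal sketch of that idea in plain numpy (the quadratic objective here is a made-up stand-in, not the poster's ODE model):

```python
import numpy as np

def objective(x):
    # Out-of-domain guard: the model is invalid for negative inputs,
    # so signal "no value here" with nan instead of computing garbage.
    x = np.asarray(x, dtype=float)
    if np.any(x < 0):
        return np.nan
    # stand-in objective for illustration
    return float(np.sum(x ** 2))

print(objective([0.5, 0.5]))    # a valid point
print(objective([-0.1, 0.5]))   # out of domain, returns nan
```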

Another approach: get rid of massOfGrass + massOfWater + massOfProtein == 1 via

massOfWater, massOfProtein, massOfCarbons = oovars(3)

massOfGrass = 1 - massOfWater - massOfProtein

and then use a box-bound solver; usually scipy_lbfgsb and algencan work better than the others. These will not step outside of the box-bounded domain.
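The reformulation can be sketched with plain SciPy as well (scipy_lbfgsb wraps SciPy's L-BFGS-B; the objective below is an assumed placeholder, not the poster's ODE model, and the bounds are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def objective(v):
    # v = [massOfWater, massOfProtein, massOfCarbons];
    # massOfGrass is recovered from the eliminated equality constraint.
    massOfWater, massOfProtein, massOfCarbons = v
    massOfGrass = 1.0 - massOfWater - massOfProtein
    # stand-in objective for illustration
    return massOfGrass**2 + massOfWater**2 + massOfProtein**2 + 2.0 * massOfCarbons

# Box bounds: L-BFGS-B never evaluates the objective outside of them.
bounds = [(0.0, 1.0), (0.0, 1.0), (0.0, 1.5)]
res = minimize(objective, x0=[0.06, 0.14, 1.0 / 60],
               method='L-BFGS-B', bounds=bounds)
print(res.x)
```

Note that the substituted massOfGrass can still go negative if massOfWater + massOfProtein exceeds 1, so a bound or penalty on that sum may still be needed for the real model.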

HTH, D.

`constraints = [x[0]>0, x[1]>0, x[2]>0, x[3]>0, x[3]<1.5, x[1]+x[2]+x[3] == 1]`

somehow negative values seem to creep in. These aren't small negative values such as (for example) -1e-6, but rather large ones, ranging from -0.1 to -0.9. As I'm trying to solve a rather complex ODE model that gives invalid results for negative inputs, I need to prevent this from happening. The ODE model itself is solved independently of the optimization code, which only makes a single function call. I've connected the optimization and the ODE solution by creating an oofun that makes the call to the ODE code, which then gets passed to the NLP constructor. I have also attempted to change the tolerances for the constraint functions, with no result.

Do let me know if I haven't provided enough details.

Cheers and many thanks,

Chinmay

In FuncDesigner it should look like this:

from FuncDesigner import *

from openopt import NLP

x = oovar(size=4)

constraints = (x[3]>0, x[3]<1.5, x[0]+x[1]+x[2] == 1) # maybe you should accompany it with personal tolerances, see doc

#define your objective here, e.g.

obj = sum(x**2)+2*x[2]

nlp = NLP(obj, {x:[0.8,0.06,0.14, 1.0/60]}, constraints=constraints)

optStruct = nlp.minimize('ralg')

See https://sage.openopt.org:8000/home/pub/32/ for results

`x[0]+x[1]+x[2] == 1`

and

`0<x[3]<a_constant`

Doing this:

```
def constraint1(x):
    print 'calling constraint 1, value is:', x,
    if 0 < x[3] < 1.5:
        return 0
    else:
        return 1

def constraint2(x):
    print 'calling constraint 2, value is:', x,
    if x[0] + x[1] + x[2] == 1:
        return 0
    else:
        return 1

constraints = [constraint1, constraint2]
nlp = NLP(f=goalFunction, x0=[0.8, 0.06, 0.14, 1/60], c=constraints)
optStruct = nlp.solve('ralg')
```

doesn't work, as x[3]<0 is still passed to goalFunction. I'm sure I'm doing something fundamentally wrong here, but even after digging through the documentation and the examples, I can't figure out what. When the constraints are satisfied, the constraint functions return values <=0, as specified in the documentation.
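One likely issue with the snippet above: the constraints return a flat 0/1 indicator, which tells the solver nothing about how badly, or in which direction, a constraint is violated. The c(x) <= 0 convention works much better with smooth residuals, and the equality would normally go to NLP's h argument rather than c. A sketch of such residual functions (plain Python, independent of any solver, with names chosen here for illustration):

```python
def c_upper(x):
    # x[3] < 1.5 expressed as a residual: x[3] - 1.5 <= 0
    return x[3] - 1.5

def c_lower(x):
    # x[3] > 0 expressed as a residual: -x[3] <= 0
    return -x[3]

def h_sum(x):
    # equality x[0] + x[1] + x[2] == 1 as a residual driven to 0
    return x[0] + x[1] + x[2] - 1.0

point = [0.8, 0.06, 0.14, 0.5]
print(c_upper(point), c_lower(point), h_sum(point))
```

These would be passed as c=[c_lower, c_upper] and h=h_sum to the NLP constructor; unlike the 0/1 version, the magnitude of each return value tells the solver how far the point is from feasibility.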

Cheers,

Chinmay Kanchi

PhD student

University of Birmingham

Using OpenOpt to determine optimal gut structures for herbivores.