[RBFOpt] max_evaluations in RBFOpt
Chuong Thaidoan
chuongthaidoan at gmail.com
Tue Sep 29 21:02:31 EDT 2020
Dear Giacomo,
I am running your RBFOpt method on my function, but it stops after 1000
evaluations. Following your manual, I set "max_evaluations=5000", but the
setting has no effect. Could you please let me know how to increase the
number of evaluations, as my data has more than ten thousand points?
Thank you in advance for your comments.
Best regards,
TD Chuong
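Two limits are worth checking here, sketched below under stated assumptions: RbfoptSettings caps the run by both max_iterations and max_evaluations, and, as noted later in this thread, only the settings object actually passed to RbfoptAlgorithm takes effect. A minimal stand-in class (not the real RbfoptSettings; the default values are assumptions to be checked against the manual) illustrates the second pitfall:

```python
# Minimal stand-in for rbfopt.RbfoptSettings, illustrating the pitfall shown
# later in this thread: only the settings object actually handed to
# RbfoptAlgorithm takes effect. The default limits here are assumptions;
# check RbfoptSettings in your installed version.
class Settings:
    def __init__(self, max_evaluations=300, max_iterations=1000):
        self.max_evaluations = max_evaluations
        self.max_iterations = max_iterations

intended = Settings(max_evaluations=5000)  # the limits the user wanted
accidental = Settings()                    # a second object created later

# Passing `accidental` to the algorithm silently reverts to the defaults.
print(accidental.max_evaluations, intended.max_evaluations)
```

The same check applies to the real code: make sure the RbfoptSettings object carrying max_evaluations=5000 is the one given to RbfoptAlgorithm.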
On Fri, Sep 25, 2020 at 10:02 AM Chuong Thaidoan <chuongthaidoan at gmail.com>
wrote:
> Dear Giacomo,
> Thank you very much for your advice. I will study these suggestions.
> Best regards,
> TD Chuong
>
> On Thu, Sep 24, 2020 at 11:40 PM Giacomo Nannicini <giacomo.n at gmail.com>
> wrote:
>
>> There's no way to do that with RBFOpt. I also don't know of any
>> algorithm for black-box problems that does that type of computation.
>> In principle, after running the optimization, say, via an object "alg"
>> of class RbfoptAlgorithm, you can loop through alg.all_node_pos and
>> alg.all_node_val to see all points explored by the algorithm. The first
>> data structure contains the x coordinates, the second contains the
>> corresponding objective function values.
>>
>> G
>>
>> On Thu, Sep 24, 2020 at 9:20 AM Chuong Thaidoan
>> <chuongthaidoan at gmail.com> wrote:
>> >
>> > Dear Giacomo,
>> > Thank you for your reply. By "solutions" I mean best solutions. Since
>> some problems may have several best solutions, I want to know how to print
>> out all such best solutions, not just one.
>> > Best regards,
>> > TD Chuong
>> >
>> > On Thu, Sep 24, 2020 at 11:00 PM Giacomo Nannicini <giacomo.n at gmail.com>
>> wrote:
>> >>
>> >> That x contains the best solution found. This is an unconstrained
>> >> problem so any point in the domain is a solution. I don't understand
>> >> what you mean by "print all the solutions".
>> >>
>> >> G
>> >>
>> >> On Thu, Sep 24, 2020 at 8:46 AM Chuong Thaidoan
>> >> <chuongthaidoan at gmail.com> wrote:
>> >> >
>> >> > Dear Giacomo,
>> >> > Thank you for your email. Yes, I would like to know all the
>> solutions: if I print(x), it shows only one, but the problem may have
>> multiple solutions. How can I print all of them?
>> >> > Best regards,
>> >> > TD Chuong
>> >> >
>> >> > On Thu, Sep 24, 2020 at 9:58 PM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >>
>> >> >> After running
>> >> >>
>> >> >> val, x, itercount, evalcount, fast_evalcount = alg.optimize()
>> >> >>
>> >> >> the solution is in x and the corresponding objective is in val.
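Unpacking and printing that return value could look like the following; the stub optimize() below is a stand-in so the snippet runs without RBFOpt installed, with the 5-tuple shape copied from the call shown above and the values taken from the run summary later in this thread:

```python
# Stub standing in for alg.optimize(); the real method returns the same
# 5-tuple: (best value, best point, iterations, evaluations, noisy evals).
def optimize():
    return (-9.999992, [1.0, 0, 10.0], 314, 300, 0)

val, x, itercount, evalcount, fast_evalcount = optimize()
print('best objective value:', val)
print('best point x1..x3:', x)
print('iterations:', itercount, 'evaluations:', evalcount)
```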
>> >> >>
>> >> >> On Thu, Sep 24, 2020 at 2:34 AM Chuong Thaidoan
>> >> >> <chuongthaidoan at gmail.com> wrote:
>> >> >> >
>> >> >> > Dear Giacomo,
>> >> >> > I have studied your paper, but I could not print out the
>> solutions. Could you please give me the Python syntax for printing the
>> solutions x1..x3 and the objective value?
>> >> >> > Thank you so much.
>> >> >> > Best regards,
>> >> >> > TD Chuong
>> >> >> >
>> >> >> > On Thu, Sep 24, 2020 at 12:28 PM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >> >>
>> >> >> >> This output looks correct. Please read the manual if you want to
>> >> >> >> familiarize yourself with input and output.
>> >> >> >>
>> >> >> >> G
>> >> >> >>
>> >> >> >> On Wed, Sep 23, 2020 at 10:25 PM Chuong Thaidoan
>> >> >> >> <chuongthaidoan at gmail.com> wrote:
>> >> >> >> >
>> >> >> >> > Dear Giacomo,
>> >> >> >> > Thank you for your comments. I have specified the path further,
>> as follows, and it now runs, but the result looks incorrect because
>> Gap=100 and no solutions x1 and x2 are shown. Please let me know your
>> thoughts.
>> >> >> >> > Iter Cycle Action Objective value Time
>> Gap
>> >> >> >> > ---- ----- ------ --------------- ----
>> ---
>> >> >> >> > 0 0 Initialization -0.782797 0.01
>> 100.00 *
>> >> >> >> > 0 0 Initialization 69.095477 0.01
>> 100.00
>> >> >> >> > 0 0 Initialization 20.000000 0.01
>> 100.00
>> >> >> >> > 0 0 GlobalStep -9.829600 0.03
>> 100.00 *
>> >> >> >> > 1 0 GlobalStep -9.946732 0.05
>> 100.00 *
>> >> >> >> > 2 0 GlobalStep -0.133622 0.06
>> 100.00
>> >> >> >> > 3 0 GlobalStep -8.304083 0.08
>> 100.00
>> >> >> >> > 4 0 GlobalStep -9.999968 0.10
>> 100.00 *
>> >> >> >> > 5 0 AdjLocalStep -9.470315 0.16
>> 100.00
>> >> >> >> > 6 1 GlobalStep -9.859619 0.18
>> 100.00
>> >> >> >> > 7 1 GlobalStep -0.302782 0.20
>> 100.00
>> >> >> >> > 8 1 GlobalStep -5.820524 0.22
>> 100.00
>> >> >> >> > 9 1 GlobalStep -9.979040 0.23
>> 100.00
>> >> >> >> > 10 1 GlobalStep -9.989542 0.25
>> 100.00
>> >> >> >> > 11 1 AdjLocalStep -9.997112 0.28
>> 100.00
>> >> >> >> > 12 2 GlobalStep -5.381130 0.31
>> 100.00
>> >> >> >> > 13 2 GlobalStep -5.403951 0.33
>> 100.00
>> >> >> >> > 14 2 GlobalStep -9.747733 0.35
>> 100.00
>> >> >> >> > 15 2 GlobalStep -9.976688 0.36
>> 100.00
>> >> >> >> > 16 2 GlobalStep -9.999575 0.38
>> 100.00
>> >> >> >> > 17 2 AdjLocalStep -9.996449 0.41
>> 100.00
>> >> >> >> > 18 2 RefinementStep -9.998610 0.41
>> 100.00
>> >> >> >> > 19 2 RefinementStep -9.998642 0.41
>> 100.00
>> >> >> >> > 20 3 Discarded 0.41
>> >> >> >> > 21 3 GlobalStep -0.983375 0.46
>> 100.00
>> >> >> >> > 22 3 GlobalStep -4.919108 0.48
>> 100.00
>> >> >> >> > 23 3 GlobalStep -8.971488 0.50
>> 100.00
>> >> >> >> > 24 3 GlobalStep -9.999017 0.52
>> 100.00
>> >> >> >> > 25 3 GlobalStep -9.998435 0.53
>> 100.00
>> >> >> >> > 26 3 AdjLocalStep -9.999657 0.57
>> 100.00
>> >> >> >> > 27 4 GlobalStep -1.300168 0.62
>> 100.00
>> >> >> >> > 28 4 GlobalStep -6.542502 0.64
>> 100.00
>> >> >> >> > 29 4 GlobalStep -6.721890 0.66
>> 100.00
>> >> >> >> > 30 4 GlobalStep -9.999000 0.68
>> 100.00
>> >> >> >> > 31 4 GlobalStep -9.999754 0.70
>> 100.00
>> >> >> >> > 32 4 AdjLocalStep -9.997845 0.74
>> 100.00
>> >> >> >> > 33 5 GlobalStep 4.811411 0.80
>> 100.00
>> >> >> >> > 34 5 GlobalStep -3.788304 0.82
>> 100.00
>> >> >> >> > 35 5 GlobalStep -8.440984 0.84
>> 100.00
>> >> >> >> > 36 5 GlobalStep -9.999929 0.86
>> 100.00
>> >> >> >> > 37 5 GlobalStep -9.999214 0.88
>> 100.00
>> >> >> >> > 38 5 AdjLocalStep -9.993257 0.91
>> 100.00
>> >> >> >> > 39 6 Discarded 0.91
>> >> >> >> > 40 6 GlobalStep -2.861610 0.99
>> 100.00
>> >> >> >> > 41 6 GlobalStep -7.320371 1.01
>> 100.00
>> >> >> >> > 42 6 GlobalStep -7.355018 1.03
>> 100.00
>> >> >> >> > 43 6 GlobalStep -9.996332 1.05
>> 100.00
>> >> >> >> > 44 6 GlobalStep -9.997607 1.07
>> 100.00
>> >> >> >> > 45 6 AdjLocalStep -9.997380 1.11
>> 100.00
>> >> >> >> > 46 7 GlobalStep -2.944679 1.20
>> 100.00
>> >> >> >> > 47 7 GlobalStep -8.033679 1.22
>> 100.00
>> >> >> >> > 48 7 GlobalStep -9.055971 1.24
>> 100.00
>> >> >> >> > 49 7 GlobalStep -9.999349 1.26
>> 100.00
>> >> >> >> > 50 7 GlobalStep -9.999964 1.28
>> 100.00
>> >> >> >> > 51 7 AdjLocalStep -9.999116 1.33
>> 100.00
>> >> >> >> > 52 8 GlobalStep -1.229660 1.44
>> 100.00
>> >> >> >> > 53 8 GlobalStep 1.597072 1.46
>> 100.00
>> >> >> >> > 54 8 GlobalStep -9.829986 1.48
>> 100.00
>> >> >> >> > 55 8 GlobalStep -9.999921 1.50
>> 100.00
>> >> >> >> > 56 8 GlobalStep -9.999275 1.53
>> 100.00
>> >> >> >> > 57 8 AdjLocalStep -9.998799 1.57
>> 100.00
>> >> >> >> > 58 9 Discarded 1.57
>> >> >> >> > 59 9 GlobalStep 14.609887 1.70
>> 100.00
>> >> >> >> > 60 9 GlobalStep -3.468827 1.72
>> 100.00
>> >> >> >> > 61 9 GlobalStep -9.970708 1.75
>> 100.00
>> >> >> >> > 62 9 GlobalStep -9.994166 1.77
>> 100.00
>> >> >> >> > 63 9 GlobalStep -9.999956 1.79
>> 100.00
>> >> >> >> > 64 9 AdjLocalStep -9.997837 1.84
>> 100.00
>> >> >> >> > 65 10 GlobalStep -4.102803 1.98
>> 100.00
>> >> >> >> > 66 10 GlobalStep -6.107631 2.01
>> 100.00
>> >> >> >> > 67 10 GlobalStep -9.848015 2.04
>> 100.00
>> >> >> >> > 68 10 GlobalStep -9.994641 2.06
>> 100.00
>> >> >> >> > 69 10 GlobalStep -9.999667 2.08
>> 100.00
>> >> >> >> > 70 10 AdjLocalStep -9.996786 2.12
>> 100.00
>> >> >> >> > 71 11 GlobalStep -1.177266 2.29
>> 100.00
>> >> >> >> > 72 11 GlobalStep -6.633674 2.32
>> 100.00
>> >> >> >> > 73 11 GlobalStep -9.895041 2.34
>> 100.00
>> >> >> >> > 74 11 GlobalStep -9.990220 2.37
>> 100.00
>> >> >> >> > 75 11 GlobalStep -9.999315 2.39
>> 100.00
>> >> >> >> > 76 11 AdjLocalStep -9.997693 2.43
>> 100.00
>> >> >> >> > 77 12 Discarded 2.43
>> >> >> >> > 78 12 GlobalStep -5.171251 2.63
>> 100.00
>> >> >> >> > 79 12 GlobalStep -8.131647 2.65
>> 100.00
>> >> >> >> > 80 12 GlobalStep -7.982044 2.67
>> 100.00
>> >> >> >> > 81 12 GlobalStep -9.999500 2.70
>> 100.00
>> >> >> >> > 82 12 GlobalStep -9.997438 2.72
>> 100.00
>> >> >> >> > 83 12 AdjLocalStep -9.998610 2.76
>> 100.00
>> >> >> >> > 84 13 GlobalStep -0.259775 2.99
>> 100.00
>> >> >> >> > 85 13 GlobalStep -3.246013 3.01
>> 100.00
>> >> >> >> > 86 13 GlobalStep -7.577369 3.04
>> 100.00
>> >> >> >> > 87 13 GlobalStep -9.997907 3.06
>> 100.00
>> >> >> >> > 88 13 GlobalStep -9.999873 3.09
>> 100.00
>> >> >> >> > 89 13 AdjLocalStep -9.995549 3.14
>> 100.00
>> >> >> >> > 90 14 GlobalStep -0.308892 3.39
>> 100.00
>> >> >> >> > 91 14 GlobalStep -8.365626 3.43
>> 100.00
>> >> >> >> > 92 14 GlobalStep -7.925407 3.46
>> 100.00
>> >> >> >> > 93 14 GlobalStep -9.996190 3.49
>> 100.00
>> >> >> >> > 94 14 GlobalStep -9.995883 3.51
>> 100.00
>> >> >> >> > 95 14 AdjLocalStep -9.999649 3.57
>> 100.00
>> >> >> >> > 96 15 Discarded 3.57
>> >> >> >> > 97 15 GlobalStep 0.866069 3.86
>> 100.00
>> >> >> >> > 98 15 GlobalStep -5.017943 3.88
>> 100.00
>> >> >> >> > 99 15 GlobalStep -8.646658 3.91
>> 100.00
>> >> >> >> > 100 15 GlobalStep -9.998978 3.94
>> 100.00
>> >> >> >> > 101 15 GlobalStep -9.998460 3.97
>> 100.00
>> >> >> >> > 102 15 AdjLocalStep -9.997587 4.02
>> 100.00
>> >> >> >> > 103 16 Discarded 4.02
>> >> >> >> > 104 16 GlobalStep 21.677489 4.34
>> 100.00
>> >> >> >> > 105 16 GlobalStep -7.807238 4.37
>> 100.00
>> >> >> >> > 106 16 GlobalStep -8.019824 4.40
>> 100.00
>> >> >> >> > 107 16 GlobalStep -9.998993 4.43
>> 100.00
>> >> >> >> > 108 16 GlobalStep -9.998398 4.45
>> 100.00
>> >> >> >> > 109 16 AdjLocalStep -9.999986 4.49
>> 100.00 *
>> >> >> >> > 110 16 RefinementStep -9.329509 4.49
>> 100.00
>> >> >> >> > 111 16 RefinementStep -9.329523 4.49
>> 100.00
>> >> >> >> > 112 17 Discarded 4.49
>> >> >> >> > 113 17 Restart 4.49
>> >> >> >> > 113 18 Initialization 8.783424 4.50
>> 100.00
>> >> >> >> > 113 18 Initialization 16.720115 4.50
>> 100.00
>> >> >> >> > 113 18 Initialization 20.000000 4.50
>> 100.00
>> >> >> >> > 113 18 GlobalStep 99.437119 4.51
>> 100.00
>> >> >> >> > 114 18 GlobalStep -9.979459 4.53
>> 100.00
>> >> >> >> > 115 18 GlobalStep -9.991883 4.55
>> 100.00
>> >> >> >> > 116 18 GlobalStep -9.967561 4.57
>> 100.00
>> >> >> >> > 117 18 GlobalStep -9.998844 4.58
>> 100.00
>> >> >> >> > 118 18 AdjLocalStep -9.999992 4.62
>> 100.00 *
>> >> >> >> > 119 19 GlobalStep -0.088290 4.64
>> 100.00
>> >> >> >> > 120 19 GlobalStep -5.029370 4.65
>> 100.00
>> >> >> >> > 121 19 GlobalStep -9.968159 4.67
>> 100.00
>> >> >> >> > 122 19 GlobalStep -9.996467 4.69
>> 100.00
>> >> >> >> > 123 19 GlobalStep -9.999552 4.71
>> 100.00
>> >> >> >> > 124 19 AdjLocalStep -9.992407 4.74
>> 100.00
>> >> >> >> > 125 20 GlobalStep -9.939811 4.77
>> 100.00
>> >> >> >> > 126 20 GlobalStep -5.045766 4.79
>> 100.00
>> >> >> >> > 127 20 GlobalStep -0.974566 4.81
>> 100.00
>> >> >> >> > 128 20 GlobalStep -9.998966 4.83
>> 100.00
>> >> >> >> > 129 20 GlobalStep -9.999806 4.85
>> 100.00
>> >> >> >> > 130 20 AdjLocalStep -9.990275 4.88
>> 100.00
>> >> >> >> > 131 20 RefinementStep -2.667284 4.88
>> 100.00
>> >> >> >> > 132 21 Discarded 4.88
>> >> >> >> > 133 21 GlobalStep 0.306504 4.92
>> 100.00
>> >> >> >> > 134 21 GlobalStep -2.309693 4.94
>> 100.00
>> >> >> >> > 135 21 GlobalStep -4.116201 4.96
>> 100.00
>> >> >> >> > 136 21 GlobalStep -8.352455 4.98
>> 100.00
>> >> >> >> > 137 21 GlobalStep -9.984745 5.00
>> 100.00
>> >> >> >> > 138 21 AdjLocalStep -9.999261 5.03
>> 100.00
>> >> >> >> > 139 22 GlobalStep -6.263660 5.08
>> 100.00
>> >> >> >> > 140 22 GlobalStep -7.134940 5.10
>> 100.00
>> >> >> >> > 141 22 GlobalStep -9.707888 5.12
>> 100.00
>> >> >> >> > 142 22 GlobalStep -8.354821 5.14
>> 100.00
>> >> >> >> > 143 22 GlobalStep -9.990861 5.16
>> 100.00
>> >> >> >> > 144 22 AdjLocalStep -9.999445 5.20
>> 100.00
>> >> >> >> > 145 23 GlobalStep -1.088456 5.26
>> 100.00
>> >> >> >> > 146 23 GlobalStep -7.013278 5.28
>> 100.00
>> >> >> >> > 147 23 GlobalStep -8.368093 5.30
>> 100.00
>> >> >> >> > 148 23 GlobalStep -8.757719 5.32
>> 100.00
>> >> >> >> > 149 23 GlobalStep -9.989832 5.34
>> 100.00
>> >> >> >> > 150 23 AdjLocalStep -9.996497 5.38
>> 100.00
>> >> >> >> > 151 24 Discarded 5.38
>> >> >> >> > 152 24 GlobalStep 2.107940 5.45
>> 100.00
>> >> >> >> > 153 24 GlobalStep -3.436103 5.47
>> 100.00
>> >> >> >> > 154 24 GlobalStep -6.037814 5.49
>> 100.00
>> >> >> >> > 155 24 GlobalStep -9.056932 5.52
>> 100.00
>> >> >> >> > 156 24 GlobalStep -9.997458 5.54
>> 100.00
>> >> >> >> > 157 24 AdjLocalStep -9.996017 5.58
>> 100.00
>> >> >> >> > 158 25 GlobalStep -0.142017 5.66
>> 100.00
>> >> >> >> > 159 25 GlobalStep -3.536719 5.68
>> 100.00
>> >> >> >> > 160 25 GlobalStep -7.619205 5.70
>> 100.00
>> >> >> >> > 161 25 GlobalStep -9.152211 5.73
>> 100.00
>> >> >> >> > 162 25 GlobalStep -9.997099 5.75
>> 100.00
>> >> >> >> > 163 25 AdjLocalStep -9.998803 5.79
>> 100.00
>> >> >> >> > 164 26 GlobalStep -6.928876 5.89
>> 100.00
>> >> >> >> > 165 26 GlobalStep -6.330172 5.91
>> 100.00
>> >> >> >> > 166 26 GlobalStep -9.790826 5.94
>> 100.00
>> >> >> >> > 167 26 GlobalStep -9.125377 5.96
>> 100.00
>> >> >> >> > 168 26 GlobalStep -9.997595 5.98
>> 100.00
>> >> >> >> > 169 26 AdjLocalStep -9.998836 6.02
>> 100.00
>> >> >> >> > 170 27 Discarded 6.02
>> >> >> >> > 171 27 GlobalStep -3.465366 6.14
>> 100.00
>> >> >> >> > 172 27 GlobalStep -4.890849 6.17
>> 100.00
>> >> >> >> > 173 27 GlobalStep -9.227066 6.19
>> 100.00
>> >> >> >> > 174 27 GlobalStep -9.350088 6.21
>> 100.00
>> >> >> >> > 175 27 GlobalStep -9.997056 6.24
>> 100.00
>> >> >> >> > 176 27 AdjLocalStep -9.998672 6.28
>> 100.00
>> >> >> >> > 177 28 GlobalStep -0.107919 6.43
>> 100.00
>> >> >> >> > 178 28 GlobalStep -9.614442 6.45
>> 100.00
>> >> >> >> > 179 28 GlobalStep -7.720567 6.48
>> 100.00
>> >> >> >> > 180 28 GlobalStep -9.361315 6.50
>> 100.00
>> >> >> >> > 181 28 GlobalStep -9.996503 6.52
>> 100.00
>> >> >> >> > 182 28 AdjLocalStep -9.999594 6.57
>> 100.00
>> >> >> >> > 183 29 GlobalStep -3.796607 6.74
>> 100.00
>> >> >> >> > 184 29 GlobalStep -2.367116 6.76
>> 100.00
>> >> >> >> > 185 29 GlobalStep -5.150750 6.79
>> 100.00
>> >> >> >> > 186 29 GlobalStep -9.415172 6.81
>> 100.00
>> >> >> >> > 187 29 GlobalStep -9.996580 6.83
>> 100.00
>> >> >> >> > 188 29 AdjLocalStep -9.997717 6.87
>> 100.00
>> >> >> >> > 189 30 Discarded 6.87
>> >> >> >> > 190 30 GlobalStep 2.992339 7.07
>> 100.00
>> >> >> >> > 191 30 GlobalStep -7.956459 7.09
>> 100.00
>> >> >> >> > 192 30 GlobalStep -7.662510 7.12
>> 100.00
>> >> >> >> > 193 30 GlobalStep -9.436481 7.14
>> 100.00
>> >> >> >> > 194 30 GlobalStep -9.993961 7.17
>> 100.00
>> >> >> >> > 195 30 AdjLocalStep -9.996651 7.21
>> 100.00
>> >> >> >> > 196 31 GlobalStep 27.107300 7.43
>> 100.00
>> >> >> >> > 197 31 GlobalStep -2.740703 7.46
>> 100.00
>> >> >> >> > 198 31 GlobalStep -6.743054 7.48
>> 100.00
>> >> >> >> > 199 31 GlobalStep -9.545115 7.50
>> 100.00
>> >> >> >> > 200 31 GlobalStep -9.991679 7.53
>> 100.00
>> >> >> >> > 201 31 AdjLocalStep -9.999586 7.58
>> 100.00
>> >> >> >> > Exception ignored in: <_io.FileIO name=3 mode='rb+'
>> closefd=True>
>> >> >> >> > ResourceWarning: unclosed file <_io.FileIO name=3 mode='rb+'
>> closefd=True>
>> >> >> >> > 202 32 GlobalStep 28.311026 7.82
>> 100.00
>> >> >> >> > 203 32 GlobalStep -6.823784 7.85
>> 100.00
>> >> >> >> > 204 32 GlobalStep -8.490190 7.88
>> 100.00
>> >> >> >> > 205 32 GlobalStep -9.568747 7.90
>> 100.00
>> >> >> >> > 206 32 GlobalStep -9.994977 7.94
>> 100.00
>> >> >> >> > 207 32 AdjLocalStep -9.997207 7.99
>> 100.00
>> >> >> >> > 208 33 Discarded 7.99
>> >> >> >> > 209 33 GlobalStep 7.712229 8.26
>> 100.00
>> >> >> >> > 210 33 GlobalStep -2.040756 8.29
>> 100.00
>> >> >> >> > 211 33 GlobalStep -8.910636 8.31
>> 100.00
>> >> >> >> > 212 33 GlobalStep -9.574243 8.34
>> 100.00
>> >> >> >> > 213 33 GlobalStep -9.998942 8.36
>> 100.00
>> >> >> >> > 214 33 AdjLocalStep -9.995341 8.41
>> 100.00
>> >> >> >> > 215 34 Discarded 8.41
>> >> >> >> > 216 34 GlobalStep 9.054157 8.71
>> 100.00
>> >> >> >> > 217 34 GlobalStep -3.606081 8.74
>> 100.00
>> >> >> >> > 218 34 GlobalStep -8.535352 8.76
>> 100.00
>> >> >> >> > 219 34 GlobalStep -9.626389 8.79
>> 100.00
>> >> >> >> > 220 34 GlobalStep -9.994735 8.81
>> 100.00
>> >> >> >> > 221 34 AdjLocalStep -9.999854 8.87
>> 100.00
>> >> >> >> > 222 35 Discarded 8.87
>> >> >> >> > 223 35 Restart 8.87
>> >> >> >> > 223 36 Initialization -8.412199 8.88
>> 100.00
>> >> >> >> > 223 36 Initialization 14.167440 8.88
>> 100.00
>> >> >> >> > 223 36 Initialization 20.000000 8.88
>> 100.00
>> >> >> >> > 223 36 GlobalStep -9.998146 8.89
>> 100.00
>> >> >> >> > 224 36 GlobalStep -9.995897 8.91
>> 100.00
>> >> >> >> > 225 36 GlobalStep -9.924097 8.93
>> 100.00
>> >> >> >> > 226 36 GlobalStep -9.974545 8.94
>> 100.00
>> >> >> >> > 227 36 GlobalStep -8.360688 8.96
>> 100.00
>> >> >> >> > 228 36 AdjLocalStep -9.998387 8.99
>> 100.00
>> >> >> >> > 229 37 GlobalStep -0.059815 9.02
>> 100.00
>> >> >> >> > 230 37 GlobalStep -0.045566 9.03
>> 100.00
>> >> >> >> > 231 37 GlobalStep -9.921982 9.05
>> 100.00
>> >> >> >> > 232 37 GlobalStep -9.988803 9.07
>> 100.00
>> >> >> >> > 233 37 GlobalStep -9.996431 9.09
>> 100.00
>> >> >> >> > 234 37 AdjLocalStep -9.999170 9.12
>> 100.00
>> >> >> >> > 235 38 GlobalStep -4.327294 9.15
>> 100.00
>> >> >> >> > 236 38 GlobalStep -4.581352 9.17
>> 100.00
>> >> >> >> > 237 38 GlobalStep -6.757061 9.18
>> 100.00
>> >> >> >> > 238 38 GlobalStep -9.991722 9.20
>> 100.00
>> >> >> >> > 239 38 GlobalStep -9.994365 9.22
>> 100.00
>> >> >> >> > 240 38 AdjLocalStep -9.999325 9.25
>> 100.00
>> >> >> >> > 241 38 RefinementStep -7.303472 9.25
>> 100.00
>> >> >> >> > 242 38 RefinementStep -7.304148 9.26
>> 100.00
>> >> >> >> > 243 39 Discarded 9.26
>> >> >> >> > 244 39 GlobalStep -9.432590 9.29
>> 100.00
>> >> >> >> > 245 39 GlobalStep -6.698595 9.31
>> 100.00
>> >> >> >> > 246 39 GlobalStep -1.372501 9.33
>> 100.00
>> >> >> >> > 247 39 GlobalStep -9.981998 9.35
>> 100.00
>> >> >> >> > 248 39 GlobalStep -9.992001 9.37
>> 100.00
>> >> >> >> > 249 39 AdjLocalStep -9.991496 9.41
>> 100.00
>> >> >> >> > 250 40 GlobalStep -4.736954 9.46
>> 100.00
>> >> >> >> > 251 40 GlobalStep -9.905023 9.47
>> 100.00
>> >> >> >> > 252 40 GlobalStep -9.612855 9.49
>> 100.00
>> >> >> >> > 253 40 GlobalStep -9.998329 9.52
>> 100.00
>> >> >> >> > 254 40 GlobalStep -9.995420 9.53
>> 100.00
>> >> >> >> > 255 40 AdjLocalStep -9.998503 9.57
>> 100.00
>> >> >> >> > 256 41 GlobalStep -0.211957 9.63
>> 100.00
>> >> >> >> > 257 41 GlobalStep -4.094734 9.65
>> 100.00
>> >> >> >> > 258 41 GlobalStep -3.472883 9.68
>> 100.00
>> >> >> >> > 259 41 GlobalStep -9.996027 9.70
>> 100.00
>> >> >> >> > 260 41 GlobalStep -9.998652 9.72
>> 100.00
>> >> >> >> > 261 41 AdjLocalStep -9.999681 9.75
>> 100.00
>> >> >> >> > 262 41 RefinementStep -6.308952 9.75
>> 100.00
>> >> >> >> > 263 41 RefinementStep -6.309271 9.76
>> 100.00
>> >> >> >> > 264 42 Discarded 9.76
>> >> >> >> > 265 42 GlobalStep -4.248808 9.84
>> 100.00
>> >> >> >> > 266 42 GlobalStep -2.394217 9.86
>> 100.00
>> >> >> >> > 267 42 GlobalStep -6.982851 9.88
>> 100.00
>> >> >> >> > 268 42 GlobalStep -9.996277 9.90
>> 100.00
>> >> >> >> > 269 42 GlobalStep -9.993987 9.92
>> 100.00
>> >> >> >> > 270 42 AdjLocalStep -9.998898 9.96
>> 100.00
>> >> >> >> > 271 43 GlobalStep -5.570539 10.05
>> 100.00
>> >> >> >> > 272 43 GlobalStep -7.641360 10.08
>> 100.00
>> >> >> >> > 273 43 GlobalStep -9.959525 10.10
>> 100.00
>> >> >> >> > 274 43 GlobalStep -9.991615 10.12
>> 100.00
>> >> >> >> > 275 43 GlobalStep -9.998084 10.14
>> 100.00
>> >> >> >> > 276 43 AdjLocalStep -9.997681 10.18
>> 100.00
>> >> >> >> > 277 44 GlobalStep -5.803065 10.29
>> 100.00
>> >> >> >> > 278 44 GlobalStep -7.090334 10.32
>> 100.00
>> >> >> >> > 279 44 GlobalStep -7.667018 10.34
>> 100.00
>> >> >> >> > 280 44 GlobalStep -9.991742 10.36
>> 100.00
>> >> >> >> > 281 44 GlobalStep -9.999368 10.38
>> 100.00
>> >> >> >> > 282 44 AdjLocalStep -9.996251 10.43
>> 100.00
>> >> >> >> > 283 45 Discarded 10.43
>> >> >> >> > 284 45 GlobalStep 3.850628 10.56
>> 100.00
>> >> >> >> > 285 45 GlobalStep -2.721334 10.58
>> 100.00
>> >> >> >> > 286 45 GlobalStep -8.301068 10.61
>> 100.00
>> >> >> >> > 287 45 GlobalStep -8.591339 10.63
>> 100.00
>> >> >> >> > 288 45 GlobalStep -9.996616 10.65
>> 100.00
>> >> >> >> > 289 45 AdjLocalStep -9.999346 10.70
>> 100.00
>> >> >> >> > 290 46 Discarded 10.70
>> >> >> >> > 291 46 GlobalStep 10.209351 10.86
>> 100.00
>> >> >> >> > 292 46 GlobalStep -7.894794 10.89
>> 100.00
>> >> >> >> > 293 46 GlobalStep -8.234840 10.91
>> 100.00
>> >> >> >> > 294 46 GlobalStep -8.838450 10.94
>> 100.00
>> >> >> >> > 295 46 GlobalStep -9.999002 10.96
>> 100.00
>> >> >> >> > 296 46 AdjLocalStep -9.998982 11.01
>> 100.00
>> >> >> >> > 297 47 Discarded 11.01
>> >> >> >> > 298 47 GlobalStep -1.463510 11.19
>> 100.00
>> >> >> >> > 299 47 GlobalStep -4.827453 11.21
>> 100.00
>> >> >> >> > 300 47 GlobalStep -6.813960 11.23
>> 100.00
>> >> >> >> > 301 47 GlobalStep -9.092894 11.26
>> 100.00
>> >> >> >> > 302 47 GlobalStep -9.995715 11.28
>> 100.00
>> >> >> >> > 303 47 AdjLocalStep -9.997012 11.33
>> 100.00
>> >> >> >> > 304 48 Discarded 11.33
>> >> >> >> > 305 48 GlobalStep -1.832841 11.53
>> 100.00
>> >> >> >> > 306 48 GlobalStep -3.611980 11.55
>> 100.00
>> >> >> >> > 307 48 GlobalStep -9.815830 11.58
>> 100.00
>> >> >> >> > 308 48 GlobalStep -9.129074 11.61
>> 100.00
>> >> >> >> > 309 48 GlobalStep -9.994837 11.63
>> 100.00
>> >> >> >> > 310 48 AdjLocalStep -9.999636 11.68
>> 100.00
>> >> >> >> > 311 49 Discarded 11.68
>> >> >> >> > 312 49 GlobalStep -8.396622 11.92
>> 100.00
>> >> >> >> > 313 49 GlobalStep -8.182026 11.94
>> 100.00
>> >> >> >> > Summary: iters 314 evals 300 noisy_evals 0 cycles 49 opt_time 11.94 tot_time 11.94 obj -9.999992 gap 100.00
>> >> >> >> >
>> >> >> >> > Process finished with exit code 0
>> >> >> >> >
>> >> >> >> > The code:
>> >> >> >> >
>> >> >> >> > import rbfopt
>> >> >> >> > #settings = rbfopt.RbfoptSettings(minlp_solver_path='Cygwin64/home/chuong/bonmin-stable/build/bonmin',\
>> >> >> >> > #                                 nlp_solver_path='Cygwin64/home/chuong/bonmin-stable/build/ipopt')
>> >> >> >> >
>> >> >> >> > import numpy as np
>> >> >> >> > def obj_funct(x):
>> >> >> >> >     return x[0]*x[1] - x[2]
>> >> >> >> >
>> >> >> >> > bb = rbfopt.RbfoptUserBlackBox(3, np.array([0] * 3), np.array([10] * 3),
>> >> >> >> >                                np.array(['R', 'I', 'R']), obj_funct)
>> >> >> >> > #settings = rbfopt.RbfoptSettings(minlp_solver_path='/home/chuong/bonmin-stable/build/bonmin',\
>> >> >> >> > #                                 nlp_solver_path='/home/chuong/bonmin-stable/build/ipopt')
>> >> >> >> > settings = rbfopt.RbfoptSettings(minlp_solver_path="C:/cygwin64/home/chuong/Bonmin-stable/build/Bonmin/bonmin.pc",\
>> >> >> >> >                                  nlp_solver_path="C:/cygwin64/home/chuong/Bonmin-stable/build/Ipopt/ipopt.pc")
>> >> >> >> > #settings = rbfopt.RbfoptSettings(max_evaluations=50)
>> >> >> >> > alg = rbfopt.RbfoptAlgorithm(settings, bb)
>> >> >> >> > val, x, itercount, evalcount, fast_evalcount = alg.optimize()
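One detail worth flagging in the snippet above: the minlp_solver_path and nlp_solver_path values point at pkg-config metadata files (bonmin.pc, ipopt.pc) rather than solver executables. A small, hedged sanity check (the helper name and path below are illustrative, not part of RBFOpt) can catch this before constructing RbfoptSettings:

```python
import os

def looks_like_solver_executable(path):
    """Heuristic check: the path must exist as a file and must not be a
    pkg-config (.pc) metadata file, which is not a runnable solver."""
    return os.path.isfile(path) and not path.endswith('.pc')

# Illustrative path taken from the snippet above; adjust to the directory
# that actually contains the compiled bonmin binary.
minlp_solver_path = "C:/cygwin64/home/chuong/Bonmin-stable/build/Bonmin/bonmin.pc"
print(looks_like_solver_executable(minlp_solver_path))
```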
>> >> >> >> >
>> >> >> >> >
>> >> >> >> > On Thu, Sep 24, 2020 at 11:24 AM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >> >> >>
>> >> >> >> >> Make sure the file is in the proper location and the path is
>> specified
>> >> >> >> >> in a way that Python understands it. Doesn't look correct at
>> first
>> >> >> >> >> glance (slash missing).
>> >> >> >> >>
>> >> >> >> >> G
>> >> >> >> >>
>> >> >> >> >> On Wed, Sep 23, 2020 at 9:17 PM Chuong Thaidoan
>> >> >> >> >> <chuongthaidoan at gmail.com> wrote:
>> >> >> >> >> >
>> >> >> >> >> > Dear Giacomo,
>> >> >> >> >> > Thank you for your reply.
>> >> >> >> >> > I have reordered the path settings as follows, but the
>> issue is still the same. Do you think Bonmin was not properly installed?
>> >> >> >> >> >  Iter  Cycle  Action          Objective value  Time  Gap
>> >> >> >> >> >  ----  -----  ------          ---------------  ----  ---
>> >> >> >> >> >     0      0  Initialization        -0.782797  0.01  100.00 *
>> >> >> >> >> >     0      0  Initialization        69.095477  0.01  100.00
>> >> >> >> >> >     0      0  Initialization        20.000000  0.01  100.00
>> >> >> >> >> >     0      0  GlobalStep            -9.829600  0.03  100.00 *
>> >> >> >> >> >     1      0  GlobalStep            -9.946732  0.05  100.00 *
>> >> >> >> >> >     2      0  GlobalStep            -0.133622  0.07  100.00
>> >> >> >> >> >     3      0  GlobalStep            -8.304083  0.09  100.00
>> >> >> >> >> >     4      0  GlobalStep            -9.999968  0.11  100.00 *
>> >> >> >> >> > Traceback (most recent call last):
>> >> >> >> >> >   File "C:/Users/chuong/PycharmProjects/LearningPython/RBFOpt_test.py", line 15, in <module>
>> >> >> >> >> >     val, x, itercount, evalcount, fast_evalcount = alg.optimize()
>> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_algorithm.py", line 795, in optimize
>> >> >> >> >> >     self.optimize_serial(pause_after_iters)
>> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_algorithm.py", line 1056, in optimize_serial
>> >> >> >> >> >     self.node_is_noisy)
>> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_algorithm.py", line 2438, in local_step
>> >> >> >> >> >     categorical_info, node_pos, rbf_lambda, rbf_h, node_pos[fmin_index])
>> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_aux_problems.py", line 297, in minimize_rbf
>> >> >> >> >> >     if (not opt.available()):
>> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\pyomo\opt\base\solvers.py", line 99, in available
>> >> >> >> >> >     raise pyutilib.common.ApplicationError("Solver (%s) not available" % str(self.name))
>> >> >> >> >> > pyutilib.common._exceptions.ApplicationError: Solver (bonmin) not available
>> >> >> >> >> >
>> >> >> >> >> > Process finished with exit code 1
>> >> >> >> >> >
>> >> >> >> >> > import numpy as np
>> >> >> >> >> > def obj_funct(x):
>> >> >> >> >> >     return x[0]*x[1] - x[2]
>> >> >> >> >> >
>> >> >> >> >> > bb = rbfopt.RbfoptUserBlackBox(3, np.array([0] * 3), np.array([10] * 3),
>> >> >> >> >> >                                np.array(['R', 'I', 'R']), obj_funct)
>> >> >> >> >> > settings = rbfopt.RbfoptSettings(minlp_solver_path='c:Cygwin64/home/chuong/bonmin-stable/build/bonmin',\
>> >> >> >> >> >                                  nlp_solver_path='c:Cygwin64/home/chuong/bonmin-stable/build/ipopt')
>> >> >> >> >> > #settings = rbfopt.RbfoptSettings(max_evaluations=50)
>> >> >> >> >> > alg = rbfopt.RbfoptAlgorithm(settings, bb)
>> >> >> >> >> > val, x, itercount, evalcount, fast_evalcount = alg.optimize()
>> >> >> >> >> >
>> >> >> >> >> >
>> >> >> >> >> > On Thu, Sep 24, 2020 at 10:59 AM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >> >> >> >>
>> >> >> >> >> >> Yes but in the code snippet you sent in the previous
>> email, you are
>> >> >> >> >> >> then passing a different RbfoptSettings object to
>> RbfoptAlgorithm --
>> >> >> >> >> >> not the one for which you set the path.
>> >> >> >> >> >>
>> >> >> >> >> >> On Wed, Sep 23, 2020 at 8:50 PM Chuong Thaidoan
>> >> >> >> >> >> <chuongthaidoan at gmail.com> wrote:
>> >> >> >> >> >> >
>> >> >> >> >> >> > Dear Giacomo,
>> >> >> >> >> >> > Thank you for your prompt reply. I already set the path
>> in the second line of the Python code, as follows:
>> >> >> >> >> >> >
>> >> >> >> >> >> > settings = rbfopt.RbfoptSettings(minlp_solver_path='/home/chuong/bonmin-stable/build/bonmin',\
>> >> >> >> >> >> >                                  nlp_solver_path='/home/chuong/bonmin-stable/build/ipopt')
>> >> >> >> >> >> >
>> >> >> >> >> >> > or
>> >> >> >> >> >> >
>> >> >> >> >> >> > import rbfopt
>> >> >> >> >> >> > settings = rbfopt.RbfoptSettings(minlp_solver_path='Cygwin64/home/chuong/bonmin-stable/build/bonmin',\
>> >> >> >> >> >> >                                  nlp_solver_path='Cygwin64/home/chuong/bonmin-stable/build/ipopt')
>> >> >> >> >> >> >
>> >> >> >> >> >> >
>> >> >> >> >> >> > On Thu, Sep 24, 2020 at 10:43 AM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >> >> >> >> >>
>> >> >> >> >> >> >> I do not understand the order of the code in the Python
>> >> >> >> >> >> >> snippet. If bonmin and ipopt are not in your path, you
>> >> >> >> >> >> >> must set the corresponding options in the RbfoptSettings
>> >> >> >> >> >> >> object that is passed to RbfoptAlgorithm.
>> >> >> >> >> >> >>
>> >> >> >> >> >> >> Giacomo
>> >> >> >> >> >> >>
>> >> >> >> >> >> >> On Wed, Sep 23, 2020 at 8:38 PM Chuong Thaidoan
>> >> >> >> >> >> >> <chuongthaidoan at gmail.com> wrote:
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > Dear Giacomo,
>> >> >> >> >> >> >> > Thank you for your advice. I have just reinstalled
>> Bonmin and run the code. It shows the errors below. Could you please take
>> a look and give me some further comments?
>> >> >> >> >> >> >> >  Iter  Cycle  Action          Objective value  Time  Gap
>> >> >> >> >> >> >> >  ----  -----  ------          ---------------  ----  ---
>> >> >> >> >> >> >> >     0      0  Initialization        -0.782797  0.01  100.00 *
>> >> >> >> >> >> >> >     0      0  Initialization        69.095477  0.01  100.00
>> >> >> >> >> >> >> >     0      0  Initialization        20.000000  0.01  100.00
>> >> >> >> >> >> >> >     0      0  GlobalStep            -9.829600  0.03  100.00 *
>> >> >> >> >> >> >> >     1      0  GlobalStep            -9.946732  0.06  100.00 *
>> >> >> >> >> >> >> >     2      0  GlobalStep            -0.133622  0.08  100.00
>> >> >> >> >> >> >> >     3      0  GlobalStep            -8.304083  0.10  100.00
>> >> >> >> >> >> >> >     4      0  GlobalStep            -9.999968  0.12  100.00 *
>> >> >> >> >> >> >> > Traceback (most recent call last):
>> >> >> >> >> >> >> >   File "C:/Users/chuong/PycharmProjects/LearningPython/Ipopt/RBFOpt_test_ipopt.py", line 14, in <module>
>> >> >> >> >> >> >> >     val, x, itercount, evalcount, fast_evalcount = alg.optimize()
>> >> >> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_algorithm.py", line 795, in optimize
>> >> >> >> >> >> >> >     self.optimize_serial(pause_after_iters)
>> >> >> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_algorithm.py", line 1056, in optimize_serial
>> >> >> >> >> >> >> >     self.node_is_noisy)
>> >> >> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_algorithm.py", line 2438, in local_step
>> >> >> >> >> >> >> >     categorical_info, node_pos, rbf_lambda, rbf_h, node_pos[fmin_index])
>> >> >> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\rbfopt\rbfopt_aux_problems.py", line 297, in minimize_rbf
>> >> >> >> >> >> >> >     if (not opt.available()):
>> >> >> >> >> >> >> >   File "C:\ProgramData\Anaconda3\lib\site-packages\pyomo\opt\base\solvers.py", line 99, in available
>> >> >> >> >> >> >> >     raise pyutilib.common.ApplicationError("Solver (%s) not available" % str(self.name))
>> >> >> >> >> >> >> > pyutilib.common._exceptions.ApplicationError: Solver (bonmin) not available
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > Your code:
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > import rbfopt
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > settings = rbfopt.RbfoptSettings(minlp_solver_path='/home/chuong/bonmin-stable/build/bonmin',\
>> >> >> >> >> >> >> >                                  nlp_solver_path='/home/chuong/bonmin-stable/build/ipopt')
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > import numpy as np
>> >> >> >> >> >> >> > def obj_funct(x):
>> >> >> >> >> >> >> >     return x[0]*x[1] - x[2]
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > bb = rbfopt.RbfoptUserBlackBox(3, np.array([0] * 3), np.array([10] * 3),
>> >> >> >> >> >> >> >                                np.array(['R', 'I', 'R']), obj_funct)
>> >> >> >> >> >> >> > settings = rbfopt.RbfoptSettings(max_evaluations=50)
>> >> >> >> >> >> >> > alg = rbfopt.RbfoptAlgorithm(settings, bb)
>> >> >> >> >> >> >> > val, x, itercount, evalcount, fast_evalcount = alg.optimize()
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> >
>> >> >> >> >> >> >> > On Wed, Sep 23, 2020 at 11:43 PM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >> >> >> >> >> >>
>> >> >> >> >> >> >> >> You need both executables. You can download them
>> from AMPL's website. If you compile from scratch, then Bonmin will also
>> compile Ipopt.
>> >> >> >> >> >> >> >>
>> >> >> >> >> >> >> >> If the executables are not in the system path, you
>> can specify their location via options.
>> >> >> >> >> >> >> >>
>> >> >> >> >> >> >> >>
>> >> >> >> >> >> >> >> G
>> >> >> >> >> >> >> >>
>> >> >> >> >> >> >> >>
>> >> >> >> >> >> >> >> On Wed, Sep 23, 2020, 9:36 AM Chuong Thaidoan <
>> chuongthaidoan at gmail.com> wrote:
>> >> >> >> >> >> >> >>>
>> >> >> >> >> >> >> >>> Dear Giacomo,
>> >> >> >> >> >> >> >>> Thank you for your email. I am reinstalling Bonmin
>> because it still shows errors even though I already specified the path. Can
>> I ask whether it is true that Bonmin contains Ipopt, so that we only need
>> to install Bonmin with Cygwin64?
>> >> >> >> >> >> >> >>> Best regards,
>> >> >> >> >> >> >> >>> TD Chuong
>> >> >> >> >> >> >> >>>
>> >> >> >> >> >> >> >>> On Wed, Sep 23, 2020 at 10:46 PM Giacomo Nannicini <
>> giacomo.n at gmail.com> wrote:
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>> Chuong,
>> >> >> >> >> >> >> >>>> please read the instruction manual, section 1.2.
>> You need Bonmin and
>> >> >> >> >> >> >> >>>> Ipopt to be in your system path, or otherwise you
>> need to specify
>> >> >> >> >> >> >> >>>> their location as options.
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>> Best,
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>> Giacomo
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>> Dear authors,
>> >> >> >> >> >> >> >>>> I am studying your interesting black-box optimization
>> >> >> >> >> >> >> >>>> package (RBFOpt), and I have just installed it with the
>> >> >> >> >> >> >> >>>> Bonmin-1.8.8 solver via Cygwin64. I put your "minimal
>> >> >> >> >> >> >> >>>> working example" in the Bonmin-1.8.8 folder and ran it
>> >> >> >> >> >> >> >>>> with PyCharm. However, it shows the error
>> >> >> >> >> >> >> >>>> "pyutilib.common._exceptions.ApplicationError: Solver
>> >> >> >> >> >> >> >>>> (bonmin) not available".
>> >> >> >> >> >> >> >>>>
>> >> >> >> >> >> >> >>>> Could you please give me some advice? Thank you.
>> >> >> >> >> >> >> >>>> Best regards,
>> >> >> >> >> >> >> >>>> Chuong Thai Doan
>> >> >> >> >> >> >> >>>> Research Fellow at School of Information Technology
>> >> >> >> >> >> >> >>>> Deakin University, Melbourne, Australia
>>
>