[Ipopt] Build failure (during 'make test') on Debian Sid
Roberto C. Sánchez
roberto at connexer.com
Mon Oct 19 13:47:02 EDT 2009
Naturally, as is my habit, I forgot the attachments the first time
around. They are attached now.
On Mon, Oct 19, 2009 at 01:29:03PM -0400, Roberto C. Sánchez wrote:
> Hi,
>
> I am trying to package Ipopt-3.7.1 for Debian. I am building on Debian
> Sid (amd64, specifically), using the following commands:
>
> CC=/usr/bin/mpicc MPICC=/usr/bin/mpicc CXX=/usr/bin/mpic++ MPICXX=/usr/bin/mpic++ F77=/usr/bin/mpif77 MPIF77=/usr/bin/mpif77 ../configure --prefix=/tmp/ipopt --with-blas="-L/usr/lib -lblas" --with-lapack="-L/usr/lib -llapack" --without-asldir --without-pardiso --without-wsmp --with-mumps-incdir="/usr/include" --with-mumps-lib="-L/usr/lib -ldmumps"
>
> make CXXFLAGS="-I../../../../Ipopt/src/Common -I../../../../Ipopt/src/LinAlg -I../../../../Ipopt/src/LinAlg/TMatrices -I../../../../Ipopt/src/Algorithm -I../../../../Ipopt/src/Algorithm/LinearSolvers -I../../../../Ipopt/src/contrib/CGPenalty" CC=/usr/bin/mpicc MPICC=/usr/bin/mpicc CXX=/usr/bin/mpic++ MPICXX=/usr/bin/mpic++ F77=/usr/bin/mpif77 MPIF77=/usr/bin/mpif77
>
> make test
>
> The configure and build steps themselves go fine. However, the test
> step fails (output attached). I'd be interested to know whether this
> is something to worry about, or whether anyone can suggest a fix.
>
> Also, please note that I had to apply the attached patch in order for
> it to build. The packages I am building against are: g++, gfortran,
> libblas-dev, libmumps-dev, and liblapack-dev.
>
> Regards,
>
> -Roberto
>
> --
> Roberto C. Sánchez
> http://people.connexer.com/~roberto
> http://www.connexer.com
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> http://list.coin-or.org/mailman/listinfo/ipopt
--
Roberto C. Sánchez
http://people.connexer.com/~roberto
http://www.connexer.com
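
Since the only reason for the MPI compiler wrappers above is that
Debian's libmumps-dev is built against Open MPI, one possible way
around the runtime failures in the attached transcript would be to
build against a sequential MUMPS variant instead, if one is packaged.
This is an untested sketch: the libmumps-seq-dev package name and the
-ldmumps_seq link flag are assumptions about the Debian packaging, not
something verified here.

# Untested alternative: plain compilers plus sequential MUMPS
# (assumed package libmumps-seq-dev), so that the test programs
# never call MPI_Init at all.
../configure --prefix=/tmp/ipopt \
    --with-blas="-L/usr/lib -lblas" \
    --with-lapack="-L/usr/lib -llapack" \
    --without-asldir --without-pardiso --without-wsmp \
    --with-mumps-incdir="/usr/include" \
    --with-mumps-lib="-L/usr/lib -ldmumps_seq"
make
make test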
-------------- next part --------------
Running unitTests...
Testing AMPL Solver Executable...
no AMPL solver executable found, skipping test...
Testing C++ Example...
---- 8< ---- Start of test program output ---- 8< ----
[quito:30623] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 130
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_plm_base_select failed
--> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[quito:30623] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../orte/runtime/orte_init.c at line 132
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_set_name failed
--> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[quito:30623] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../orte/orted/orted_main.c at line 323
[quito:30603] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 381
[quito:30603] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 143
[quito:30603] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../orte/runtime/orte_init.c at line 132
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_set_name failed
--> Returned value Unable to start a daemon on the local node (-128) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):
ompi_mpi_init: orte_init failed
--> Returned "Unable to start a daemon on the local node" (-128) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
[quito:30603] Abort before MPI_INIT completed successfully; not able to guarantee that all other processes were killed!
---- 8< ---- End of test program output ---- 8< ----
******** Test FAILED! ********
Output of the test program is above.
Testing C Example...
---- 8< ---- Start of test program output ---- 8< ----
[quito:30641] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 130
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_plm_base_select failed
--> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[quito:30641] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../orte/runtime/orte_init.c at line 132
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_set_name failed
--> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[quito:30641] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../orte/orted/orted_main.c at line 323
[quito:30627] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 381
[quito:30627] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 143
[quito:30627] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../orte/runtime/orte_init.c at line 132
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_set_name failed
--> Returned value Unable to start a daemon on the local node (-128) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):
ompi_mpi_init: orte_init failed
--> Returned "Unable to start a daemon on the local node" (-128) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
[quito:30627] Abort before MPI_INIT completed successfully; not able to guarantee that all other processes were killed!
---- 8< ---- End of test program output ---- 8< ----
******** Test FAILED! ********
Output of the test program is above.
Testing Fortran Example...
---- 8< ---- Start of test program output ---- 8< ----
[quito:30659] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 130
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_plm_base_select failed
--> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[quito:30659] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../orte/runtime/orte_init.c at line 132
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_set_name failed
--> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[quito:30659] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ../../../orte/orted/orted_main.c at line 323
[quito:30645] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 381
[quito:30645] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 143
[quito:30645] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start a daemon on the local node in file ../../../orte/runtime/orte_init.c at line 132
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_set_name failed
--> Returned value Unable to start a daemon on the local node (-128) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):
ompi_mpi_init: orte_init failed
--> Returned "Unable to start a daemon on the local node" (-128) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
[quito:30645] Abort before MPI_INIT completed successfully; not able to guarantee that all other processes were killed!
---- 8< ---- End of test program output ---- 8< ----
******** Test FAILED! ********
Output of the test program is above.
exit: 109: Illegal number: -1
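
All three test failures above happen inside Open MPI's startup
(orte_init, then MPI_Init), before any Ipopt code runs, so the problem
should be reproducible without Ipopt at all. A minimal check, using
the same mpicc wrapper as the build; the file name mpi_probe.c and its
contents are illustrative, not part of the original report:

# Build and run a do-nothing MPI program. Running it directly, without
# mpirun, exercises the same ess/singleton startup path shown in the log.
cat > mpi_probe.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);   /* the call the failing tests die in */
    printf("MPI_Init succeeded\n");
    MPI_Finalize();
    return 0;
}
EOF
mpicc mpi_probe.c -o mpi_probe
./mpi_probe

If this fails the same way, the fault lies with the Open MPI
installation or the build environment (for instance, a chroot in which
the orted daemon cannot be spawned), not with Ipopt itself. The final
line of the transcript ("exit: 109: Illegal number: -1") looks like a
separate, minor bug: Debian's /bin/sh is dash, whose exit builtin
rejects negative values, so a test script ending in "exit -1" would
need "exit 1" instead.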
-------------- next part --------------
A non-text attachment was scrubbed...
Name: extern_mangle_fix.patch
Type: text/x-diff
Size: 928 bytes
Desc: not available
URL: http://list.coin-or.org/pipermail/ipopt/attachments/20091019/db6b8846/attachment.bin