Bill Allombert on Sun, 04 Dec 2022 21:44:35 +0100
Re: help with MPI and SLURM
- To: pari-users@pari.math.u-bordeaux.fr
- Subject: Re: help with MPI and SLURM
- From: Bill Allombert <Bill.Allombert@math.u-bordeaux.fr>
- Date: Sun, 4 Dec 2022 21:43:20 +0100
- In-reply-to: <2d6068e3-c36b-1138-b460-5b6c8ae490c5@ug.edu.pl>
- Mail-followup-to: pari-users@pari.math.u-bordeaux.fr
- References: <2d6068e3-c36b-1138-b460-5b6c8ae490c5@ug.edu.pl>
On Sun, Dec 04, 2022 at 09:12:34PM +0100, Markus Grassl wrote:
> Hello,
>
> I hope that someone here on the list can help me with the MPI version of
> PARI/gp in a SLURM environment.
I have access to a SLURM environment with MPI. Let's just say I get it to work
when I need it, but it is always finicky: it depends on the way the network is
set up, and different partitions might need different settings.
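A quick check I usually start with (just a diagnostic sketch, assuming a recent
enough SLURM) is to ask srun which PMI plugins it was built with:

  srun --mpi=list

If the list includes pmi2 or pmix, you can try passing the matching --mpi=
option to srun; if it does not, srun cannot bootstrap Open MPI directly.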
> The application appears to have been direct launched using "srun",
> but OMPI was not built with SLURM's PMI support and therefore cannot
> execute. There are several options for building PMI support under
> SLURM, depending upon the SLURM version you are using:
Looking at the error message, this is a problem with your Open MPI setup, not with PARI.
Maybe your cluster has another version of MPI or Open MPI that you can use.
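What often works in that situation is to avoid srun for the actual launch and
let mpirun start the processes inside the allocation, since Open MPI's mpirun
does not need SLURM's PMI support. A minimal batch script sketch, where the
module name, task counts and script name are only placeholders for whatever
your site provides:

  #!/bin/bash
  #SBATCH --job-name=pari-mpi
  #SBATCH --nodes=2
  #SBATCH --ntasks-per-node=8
  # Load your site's Open MPI (the module name is a guess).
  module load openmpi
  # mpirun picks up the SLURM allocation and starts one gp per task;
  # rank 0 runs the script, the other ranks act as parallel workers.
  mpirun gp yourscript.gp

Here gp is the binary built with MPI support (Configure --mpi).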
Cheers,
Bill.