Subject: Re: recommended OS for development
From: Axel Huebl
Organization: HZDR
To: picongpu-users@hzdr.de
Date: Wed, 24 May 2017 11:04:33 +0200

Hi Andrei,

thank you for the questions! Yes, of course we will assist you in getting started.

For comparisons, it might be best to run them on a cluster, even if you just use one node in an interactive session. You might also want to take the upcoming 0.3.0 release for that, since we have again improved many aspects of the code. Feel free to contact us about performance; for example, setting the correct GPU compute architecture during `pic-configure` is important for optimal performance (a sketch follows at the end of this message).

We also operate on Slurm clusters regularly and have a new section on them in our manual:

  http://picongpu.readthedocs.io/en/dev/usage/tbg.html#batch-system-examples

You might just want to clone the tbg examples for Slurm from the TU Dresden cluster "Taurus", located here:

  https://github.com/ComputationalRadiationPhysics/picongpu/tree/dev/src/picongpu/submit/taurus-tud

instead of modifying the quite different PBS submit scripts (see the second sketch below).

If you want to try PIConGPU on a laptop, I would recommend setting it up via nvidia-docker. In the last weeks I have seen several people struggle to get their individual software environments on desktops into a usable state; Docker also makes it easier for us to support you, since it automatically creates reproducible environments that you can share with us via Dockerfiles.
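On the `pic-configure` point: a minimal sketch, assuming the CMake option controlling the target architecture is named CUDA_ARCH (please check the manual for the exact name in your version) and an input directory $HOME/picInputs/myLWFA that is purely illustrative:

    # pass CMake options through pic-configure's -c flag; sm_60 targets
    # Pascal-generation GPUs, sm_37 the K80 generation -- pick the compute
    # capability of your actual card
    pic-configure -c "-DCUDA_ARCH=sm_60" $HOME/picInputs/myLWFA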
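For the Slurm side, submission goes through tbg with a Slurm-specific template; a sketch with illustrative .cfg/.tpl file names (take the real ones from the taurus-tud directory linked above):

    # -s selects the submit command, -c the machine-independent run
    # config, -t the batch-system template; the last argument is the
    # output directory created for this run
    tbg -s sbatch \
        -c submit/0008gpus.cfg \
        -t submit/taurus-tud/k80_profile.tpl \
        $SCRATCH/runs/lwfa_001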
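And for the laptop route, a sketch of the nvidia-docker workflow, assuming you saved the Dockerfile from issue #829 into an empty directory; the image name and tag are placeholders:

    # build the image with plain docker ...
    docker build -t picongpu:dev .
    # ... and run it with nvidia-docker, which mounts the host's GPU
    # devices and driver libraries into the container
    nvidia-docker run --rm -it picongpu:dev bash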
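Since Spack comes up in my earlier mail (quoted below), here is a minimal sketch of pulling in one of our dependencies that way; the package name is an assumption, check `spack list` for what is available:

    # clone spack and activate it in the current shell
    git clone https://github.com/LLNL/spack.git
    . spack/share/spack/setup-env.sh
    # build and install a dependency, then verify spack knows about it
    spack install cmake
    spack find cmake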
Axel

On 24.05.2017 10:19, Andrei Berceanu wrote:
> Dear Axel,
>
> Thank you for the warm welcome and your detailed reply! While I am quite new to PIC codes, I have done scientific computing in the past, and I now work on laser-matter interaction at ELI-NP [1], where we are currently evaluating the possibility of using PIConGPU instead of EPOCH. In fact, my previous question about the OS/software stack was because I want to install a development version of PIConGPU on my personal laptop [2] for some benchmarks against EPOCH and for playing around with the code.
>
> On a separate note, one of my colleagues managed to install PIConGPU on a cluster, but he told me that all the examples he could find in the docs refer to the PBS batch system, while the batch system on this particular cluster is Slurm. Are you aware of any examples of using PIConGPU with Slurm?
>
> Thanks,
>
> Andrei
>
> [1] http://www.eli-np.ro
> [2] https://www.asus.com/ROG-Republic-Of-Gamers/ROG-GL552JX/specifications/
>
> On Sat, 20 May 2017 17:10:34 +0200, Axel Huebl wrote:
>
>> Hi Andrei,
>>
>> Welcome to our list! Most of us develop on a Debian derivative, e.g. Debian testing or the latest releases of Ubuntu or Mint. Most of our runtime testing is done in interactive queues on various clusters.
>>
>> We don't have a Vagrant setup yet, but in case you like Docker, we have an nvidia-docker file under:
>> https://github.com/ComputationalRadiationPhysics/picongpu/issues/829
>>
>> If you are an expert in Vagrant, please don't hesitate to share your setup. I just did not find the time to check how well GPU support works with Vagrant and whether people are using it regularly. I know it exists but have not seen other scientists working with it yet.
>>
>> Package-wise, we usually need rather recent versions of CUDA, CMake, and Boost, plus a working MPI installation; see our install requirements for exact versions:
>> https://picongpu.readthedocs.io/en/dev/install/dependencies.html
>>
>> An attractive and brand-new alternative is Spack, which has working build recipes for all our dependencies:
>> https://github.com/LLNL/spack
>> https://spack.io
>>
>> Are you just generally interested, or are you developing with a specific lab/university? If you need any help getting started or have a specific development goal in mind, feel free to open threads here or issues on GitHub.
>>
>> Again, welcome! :)
>>
>> Cheers,
>> Axel

-- 
Axel Huebl
Phone +49 351 260 3582
https://www.hzdr.de/crp
Computational Radiation Physics
Laser Particle Acceleration Division
Helmholtz-Zentrum Dresden - Rossendorf e.V.

Bautzner Landstrasse 400, 01328 Dresden
POB 510119, D-01314 Dresden
Vorstand: Prof. Dr.Dr.h.c. R. Sauerbrey, Prof. Dr.Dr.h.c. P. Joehnk
VR 1693 beim Amtsgericht Dresden