Minimum working example #22
Comments
You can read the help by calling the script with `-h`. So, for your example, you can run the reduction like this:
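The concrete commands were lost from the quoted comment above; a plausible reconstruction, based on the usage text and the commands that appear later in this thread:

```shell
# Print the full list of options (the -h flag appears in the tool's usage text)
python reduction_main.py -h

# Example: randomly delete 50% of the particles
# (file names taken from the example later in this thread)
python reduction_main.py -hdf simData_orig_60000.h5 -hdf_re out.h5 \
    -ratio_deleted_particles 0.5 -algorithm random
```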
That helps a lot. I have just one more question: where can I select the particle species?
Could you send me the error message or a link to the file? It would help me a lot.
As discussed offline with @PrometheusPi, I have now tested with an original file directly from the PIC simulation:

```
python reduction_main.py -hdf simData_orig_60000.h5 -hdf_re out.h5 -ratio_deleted_particles 0.5 -algorithm random
```

I got the following output:
"Speicherzugriffsfehler" is German for segmentation fault.
@KseniaBastrakova Under what conditions could
It only takes the values from the input openPMD file and multiplies them by unit_SI (or, for position, also adds positionOffset). Can you send me a link to the file? I will reproduce it and find the bug.
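As a sketch of that conversion (plain NumPy; the variable names are illustrative, not the tool's actual code): openPMD stores each record in internal units together with a unit_SI factor, and positions additionally carry a positionOffset record.

```python
import numpy as np

# Illustrative values; in a real file these come from the openPMD records
# and their attributes.
position = np.array([0.1, 0.2, 0.3])   # "position" record, internal units
position_unit_si = 1.0e-6              # unit_SI attribute of "position"
offset = np.array([10.0, 10.0, 10.0])  # "positionOffset" record, internal units
offset_unit_si = 1.0e-6                # unit_SI attribute of "positionOffset"

# SI position = position * unit_SI + positionOffset * its own unit_SI
position_si = position * position_unit_si + offset * offset_unit_si
print(position_si)  # [1.01e-05 1.02e-05 1.03e-05]
```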
@KseniaBastrakova Thanks for offering to test the file. 👍
@PrometheusPi this is the original file: /scratch/ws/1/s5960712-LPWFA_2020/runs_LWFA/010_LWFA_doping=0.005_Laguerre_k80/simOutput/h5/simData_beta_60000.h5

@alex-koe Thank you. Surprisingly, the file was not as big as expected, so GridFTP is not really needed; I just copy it with

The file can be found on
Thank you for the example. Now the reduction works normally. But I noticed that there aren't any fields in the simData_beta_60000.h5 file; I ask because they are necessary for the conversion. Also, I reproduced the floating segmentation fault. I think the problem may be in openPMD-api, not in the file or in my code (I am now working on localizing it).
I added an additional parameter `-grid_size` to the openPMD to GDF converter:
@alex-koe Could you try the new option?
@PrometheusPi I will see how much time I have during the beam time.
@KseniaBastrakova @PrometheusPi Somehow, I cannot make it work.

First, I ran

```
python reduction_main.py -hdf_re reduced_60000.h5 -hdf simData_orig_60000.h5 -grid_size 0.00008 -ratio_deleted_particles 0.003 -algorithm random
```

and got

```
usage: reduction_main.py [-h] [-algorithm algorithm] [-hdf hdf_file]
                         [-hdf_re hdf_file_reduction]
                         [-ratio_deleted_particles ratio_deleted_particles]
                         [-momentum_tol tolerance_momentum]
                         [-position_lol tolerance_position]
                         [-leveling_coefficient leveling_coefficient]
reduction_main.py: error: unrecognized arguments: -grid_size 0.00008
```

and I did not find that argument in any file.

Second, I ran

```
python reduction_main.py -hdf_re reduced_60000.h5 -hdf simData_orig_60000.h5 -ratio_deleted_particles 0.003 -algorithm random
```

and got

```
Series constructor called with explicit iteration suggests loading a single file with groupBased iteration encoding. Loaded file is fileBased.
```

And there was no file generated. :-(
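For context, this error means the file uses fileBased iteration encoding: in openPMD, fileBased series use a %T placeholder in the file name that is substituted by the iteration number, with one file per iteration. A plain-Python illustration of the naming scheme (not openPMD-api itself):

```python
# fileBased openPMD series use a file-name pattern with a %T placeholder;
# each iteration is written to its own file.
pattern = "reduced_%T.h5"
iteration = 60000

# The placeholder is substituted by the iteration number:
filename = pattern.replace("%T", str(iteration))
print(filename)  # reduced_60000.h5
```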
Oh, sorry for the misunderstanding! I just reproduced the full use case (reduction + conversion) and noticed that this parameter is necessary for my GDF converter (https://github.com/ComputationalRadiationPhysics/openPMD-converter-GDF), so I added it to the converter, not to the particle reduction.
I think I understood your problem. So, first, do: And next, try to modify
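The concrete steps were lost from the comment above; judging from the working command later in this thread, a plausible reconstruction is to use the %T placeholder in both file names and select the iteration explicitly:

```shell
# Hypothetical reconstruction (mirrors the working command further down):
# use the fileBased %T pattern and pick the iteration explicitly.
python reduction_main.py -hdf simData_orig_%T.h5 -hdf_re reduced_%T.h5 \
    -iteration 60000 -ratio_deleted_particles 0.003 -algorithm random
```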
Thanks for your detailed description. I pulled the new version of your code. It worked well and the file was written. How can I open the file "reduced_%T.h5"? I did two things:
Which h5py version do you have? I tried it and got an error as well. Could you try to open the result file with openPMD-api?
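For reference, a minimal way to inspect such a file with h5py (assuming h5py is installed; the demo file and its group layout below are made up for illustration and are not the exact layout of reduced_60000.h5):

```python
import h5py

# Create a tiny HDF5 file to demonstrate, then inspect it the same way
# you would inspect the reduced output file.
with h5py.File("demo.h5", "w") as f:
    f.create_dataset("data/60000/particles/e/weighting", data=[1.0, 2.0])

with h5py.File("demo.h5", "r") as f:
    f.visit(print)  # print every group and dataset path in the file
    weights = f["data/60000/particles/e/weighting"][:]

print(list(weights))  # [1.0, 2.0]
```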
Thanks a lot @KseniaBastrakova, it is now working! I ran

```
python reduction_main.py -hdf_re reduced_60000.h5 -hdf simData_orig_60000.h5 -ratio_deleted_particles 0.003 -algorithm random 2&>1 | tee output.txt
```

However, it ended without any output at all in output.txt: the messages were not written to the file, only buffered. This means that not even an indication of exceeding the Taurus runtime limit was left.
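Two likely culprits, as a guess: `2&>1` is not the intended redirection (it should be `2>&1`), and Python block-buffers stdout when it is piped, so nothing reaches `tee` until the buffer flushes; `python -u` disables that buffering. A self-contained demonstration:

```shell
# -u disables Python's stdout buffering; 2>&1 (not 2&>1) merges stderr
# into stdout so both streams reach tee immediately.
python3 -u -c 'import sys; print("step 1"); print("warning", file=sys.stderr)' \
    2>&1 | tee output.txt
cat output.txt
```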
To be complete, this is the submit script I used on Taurus:

```bash
#!/bin/bash
#SBATCH --time=02:00:00
#SBATCH --job-name=part-red
#SBATCH --nodes=1
#SBATCH --mem-per-cpu=4000
#SBATCH --ntasks=2
# send me mails on BEGIN, END, FAIL, REQUEUE, ALL,
# TIME_LIMIT, TIME_LIMIT_90, TIME_LIMIT_80 and/or TIME_LIMIT_50
#SBATCH --mail-type=ALL
#SBATCH [email protected]

export PYTHONPATH="$HOME/anaconda3/"
export PATH="$PYTHONPATH/bin:$PATH"

python reduction_main.py -hdf simData_orig_%T.h5 -hdf_re simData_for_GPT_%T.h5 -iteration 60000 -ratio_deleted_particles 0.9970685060813841 -algorithm random
```
It completed after less than 30 min. XXX has to be replaced with a working address in case someone copies the code.
Is there a minimum working example of the particle reduction? I tried to get reduction_main.py to run with the following bash line:

There is no output from the program that I could post, and the output file out.h5 seems not to be written. I guess I did not call the program correctly.

So, how do I have to run the program? :-)