Use MC_RUNJOB at CCIN2P3 (private production)
The list of available mc_runjob versions can be obtained with the usual command;
the version tagged current is the one to use.
The full mc_runjob documentation is available online.
This note is aimed at users who want to perform private MC production.
It will explain the interface and the modifications to mc_runjob as it is
implemented at the CCIN2P3.
In a few words, a set of macro templates is provided, together with a Python script
to edit these templates and to submit the request to the BQS batch system.
The output files will generally be copied to HPSS, and the log and metadata files
will be returned to a user-specified AFS directory.
All the needed parameters and all the actions are given in the mc_runjob macro.
Only a few parameters (number of events, input file name, directory name) can be
edited in the templates by the submitting script; the other parameters are hardcoded.
Of course, you are free to change your own version of the templates.
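The parameter editing described above amounts to simple placeholder substitution. Here is a minimal sketch in Python; the placeholder names and values below are hypothetical, and the real templates define their own conventions (see the README in the tarball):

```python
# Hedged sketch of the kind of substitution the submitting script performs
# on a macro template. The placeholder names are hypothetical; the real
# templates define their own (see the README).
template = (
    "num_events = %(numrecords)s\n"
    "input_file = %(inputfilename)s\n"
    "output_dir = %(copybackdir)s\n"
)

params = {
    "numrecords": "500",
    "inputfilename": "MyGenFile",
    "copybackdir": "/afs/in2p3.fr/home/m/me/mcprod",
}

# All other parameters stay hardcoded in the template itself.
macro = template % params
print(macro)
```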
The archive with the available templates and scripts can be found in:
- $THRONG_DIR/info_data/mcp14/mcrun_tools_vxx.yy.zz.tar.gz
where vxx.yy.zz corresponds to the current version of mc_runjob.
It contains the following files :
- README explaining what has to be tailored in the templates for your own production.
Scripts
- SubmitMC.py the script to submit the MC request
- MacroTemplates.py the correspondence between configuration, template and job name
- MakeFileList.py a script which scans the job output directories to collect some info
about the output files (also useful to check the job outputs).
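As an illustration of what a MakeFileList.py-style scan does, here is a self-contained sketch; the directory pattern ("job_output") and file-name pattern are assumptions for the example, not the script's actual conventions:

```python
import os

# Hypothetical sketch of a MakeFileList.py-style scan: walk the job
# output directories and collect the matching output files into a
# sorted list of full path names, one file per line when written out.
def make_file_list(topdir, pattern="d0gstar"):
    paths = []
    for root, dirs, files in os.walk(topdir):
        # Only look inside job output directories (name is an assumption).
        if "job_output" not in root:
            continue
        for name in files:
            if pattern in name:
                paths.append(os.path.join(root, name))
    return sorted(paths)
```

A list built this way can also be used to check that every job actually produced its output file.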
Templates
- single-gen-template.macro to generate single particles
- pythia-Zee-template.macro to generate Zee events with Pythia
- pythia-d0mess-template.macro to generate and filter events with Pythia and D0mess
- d0gstar-template.macro to run d0gstar on generated events
- d0sim-template.macro to run d0sim on d0gstar events
- d0reco-template.macro to run d0reco on simulated events
- d0sim-reco-template.macro to run d0sim and d0reco on d0gstar events (!! the parameters to overlay MinBias events are hardcoded in the template !!)
- Minbias-template.macro to generate and produce d0gstar minbias events
- merger-template.macro to merge thumbnails
- tmbanalyze-template.macro to produce Root trees from thumbnails
- tmbfixer-template.macro to fix thumbnails
- tmbfix-tmbana-template.macro to merge, fix thumbnails and produce Trees
How to create your MC production environment:
- Create a working space, then uncompress and untar the file:
- tar zxvf $THRONG_DIR/info_data/mcp17/mcrun_tools.tar.gz
- Edit the templates according to the instructions in the README file.
How to submit MC jobs:
- Your personal job counters will be recorded on the file ~/JobCounters
- setup mc_runjob vxx.yy.zz (the recommended version for MC production)
- setup D0RunII xx.yy.zz (the exact version number does not matter here, but use
the version foreseen for MC production).
- Now get acquainted with SubmitMC.py script by typing :
./SubmitMC.py -help
- the required parameter -config= refers to a given template; the names are
self-explanatory, but you can look at MacroTemplates.py to see the
configuration-template-jobprefix correspondence. Of course, you can add your
own entry to this dictionary.
- The other required parameter is -copybackdir=, the directory where you want
to save the log and script files.
- So the general command is:
./SubmitMC.py -config=cccc -copybackdir=ddddd -numrecords=xx -submit='submit options'
Without the -submit option, the job will be prepared but not submitted (this may be
useful for an interactive test).
Some examples:
- Generator step:
Run a single generator job with a large number of events (say 10,000):
./SubmitMC.py -config=pythia-Zee -numrecords=10000 -repeat=1 ....
- D0gstar step:
Run as many d0gstar jobs as needed to process these events, taking into account
that a 500-event job takes more than 1M seconds (class T):
./SubmitMC.py -config=d0gstar -inputfilename="MyGenFile" -numrecords=500 -repeat=20 ...
Jobs can be submitted in several commands with the -skiprecords= parameter.
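For instance, the 10,000 generated events above could be split into 20 submissions of 500 events each by stepping -skiprecords. A sketch of the commands this produces (the exact semantics of -skiprecords= are assumed from the text above):

```python
# Sketch: build the series of SubmitMC.py commands needed to process
# 10,000 generated events in 20 jobs of 500 events each, offsetting
# each submission with the -skiprecords= parameter (assumed semantics).
total_events, events_per_job = 10000, 500
commands = [
    "./SubmitMC.py -config=d0gstar -inputfilename=MyGenFile "
    "-numrecords=%d -repeat=1 -skiprecords=%d" % (events_per_job, skip)
    for skip in range(0, total_events, events_per_job)
]
for cmd in commands:
    print(cmd)
```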
- D0sim + D0reco step:
When all the d0gstar jobs are finished, run the same number of d0sim+d0reco jobs:
./SubmitMC.py -config=d0sim-reco -inputfilelist="MyD0gstarfiles" -numrecords=500 -repeat=20
The "MyD0gstarfiles" list can be created with the provided MakeFileList.py script.
The assumption here is "one job processes one input file".
Jobs can be submitted in several commands using the -skipfiles= parameter.
- One can then merge the "small size" thumbnail files:
./SubmitMC.py -config=merge -mergefilelist="MyTMBfiles" -metadatalist="MyMetadataTMBlist" ....
The metadata list can be created with a command like:
find `pwd` -name 'import_kw_tmb*.py' | grep job_output
This list should contain the full path names of the metadata files describing the
thumbnail files, otherwise the worker running the job will not find them.
Also, the files in both lists should be ordered in increasing event number.
This can be achieved with a minimum of effort with the following recommendations :
- Wait for all the d0gstar jobs to finish
- Submit the d0sim and d0reco steps in one job (d0sim-reco-template.macro),
using the MakeFileList.py script to get the list of d0gstar files, and sort
the obtained list in increasing d0gstar job number
- Submit the merger job after sorting the thumbnail and metadata files
according to the d0sim-reco job number
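The sorting by job number can be done with a few lines of Python, assuming the job number is the last run of digits in each file name (a hypothetical convention; adapt the pattern to your actual file names):

```python
import re

# Sketch: sort a file list in increasing job number, taking the job
# number to be the last group of digits in the file name. The example
# file names are hypothetical.
def sort_by_job_number(paths):
    def job_number(path):
        numbers = re.findall(r"\d+", path)
        return int(numbers[-1]) if numbers else 0
    return sorted(paths, key=job_number)

files = ["tmb_job10.raw", "tmb_job2.raw", "tmb_job1.raw"]
print(sort_by_job_number(files))
# -> ['tmb_job1.raw', 'tmb_job2.raw', 'tmb_job10.raw']
```

A plain alphabetical sort would put job10 before job2, which is why the numeric key is needed.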
A final word
Space is very limited on the D0 GROUP_DIR directories.
So, when your production is ready to be used, please tar your working directory,
compress it, and archive the resulting file with the help of elliotc:
elliotc archive -G y -c "comment" YourTarGzfile
You may have to run elliotc on another platform (IBM or SUN), since it
is not fully available on the Linux platform.
Michel Jaffre
Last modified: Wed Oct 27 17:35:00 GMT+1 DST 2005