Revision 12 - 2011-04-22 - PeterWinter

-- PeterWinter - 16 Feb 2009

This page describes how I create the final lifetime histograms from the output shrubs from ./mta....

Subdividing the data sets into groups with the same magnet setting

The first step is to separate the runs into data sets according to Sara's classification scheme. An important part of this is to sort them into groups by magnet setting (mainly +125A and -125A), which can be done from the MySQL database. The extracted data set files are in ~/jobscripts/lists/datasets. The files in there are named e.g. prod_rn10_mu+_a_125.txt or prod_rn11_mu-_c.txt, and each contains the list of run numbers that belong to that data set.
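The grouping step above can be sketched as follows. This is a minimal, hypothetical illustration only: the real selection is done against the MySQL database, and the run numbers and currents here are invented examples.

```python
# Hypothetical sketch of grouping runs by magnet setting.
# The real workflow queries the MySQL database; the run/current
# pairs below are invented for illustration.
from collections import defaultdict


def group_runs_by_magnet(run_currents):
    """Map magnet current (e.g. +125 or -125) to a sorted list of run numbers."""
    groups = defaultdict(list)
    for run, current in run_currents.items():
        groups[current].append(run)
    return {current: sorted(runs) for current, runs in groups.items()}


if __name__ == "__main__":
    # Invented example runs; in practice these come from the database.
    runs = {50123: +125, 50124: +125, 50130: -125}
    print(group_runs_by_magnet(runs))
```

Each resulting group would then be written out as one of the data set files described above.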

Creating the links

After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. The original trees usually live on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, where treepass2 must be replaced by the production you are interested in (currently ProdMay09). To create the links for each data set, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl. It makes the links according to the run numbers listed in the current $datasetfile = $HOME."/jobscripts/lists/datasets/".$dataset.".txt"; . For example, to create the links for the run 11 data set a_125 for mu+, run:

> ./MakeLinksForDatasets.pl 11 prod_rn11_mu+_a_125 Pixel_;

Here the argument 11 denotes that this is run 11. The second argument is the data set name, and the last states that you want to do this for the Pixel shrubs, i.e. the Pixel_treeNNNNN.root files. You might need to adapt some directory paths in the script to your needs.
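The core of what MakeLinksForDatasets.pl does can be sketched like this. This is an assumption-laden illustration, not the Perl script itself: the tree file-name pattern Pixel_tree<run>.root is taken from the text above, and the function name and directory handling are invented.

```python
# Sketch of the link-making step: for every run number listed in a
# data set file, symlink the matching tree file into a per-data-set
# directory. The file-name pattern follows Pixel_treeNNNNN.root from
# the text; everything else here is a hypothetical stand-in for the
# actual MakeLinksForDatasets.pl.
import os


def make_links_for_dataset(dataset_file, source_dir, link_dir, prefix="Pixel_"):
    """Link source_dir/<prefix>tree<run>.root into link_dir for each run."""
    os.makedirs(link_dir, exist_ok=True)
    with open(dataset_file) as f:
        runs = [line.strip() for line in f if line.strip()]
    for run in runs:
        name = f"{prefix}tree{run}.root"
        target = os.path.join(source_dir, name)
        link = os.path.join(link_dir, name)
        # Only link files that exist, and do not overwrite existing links.
        if os.path.exists(target) and not os.path.islink(link):
            os.symlink(target, link)
    return runs
```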

Merging the shrubs on the nodes and transferring to NPL

I have combined these steps into one macro. The main macro to be called at NCSA is ~/jobscripts/ncsa_jobscript_merge.pl. Before you run it, you might again need to adapt some of the directory variables in there, as well as the NPL login for the scp command. After that, you can run it like:

> ./ncsa_jobscript_merge.pl prod_rn11_mu+_a_125 8 11 Pixel_;

Here, the first argument is the data set name and the second the number of processes run at a time. The third argument is the PSI run number and the last gives the prefix for the ROOT file names. You can either submit this to the nodes or run it on an interactive machine, since the processing time is quite short. The merged tree files will then be copied over to NPL, provided you specified the $remote_dir variable in the perl script.
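The merged output files follow the mergedNNNNN-NNNNN.root naming mentioned in the next section. As a sketch of how such names arise, the following hypothetical helper chunks a sorted run list into fixed-size groups and derives the merged file name from the first and last run of each group. The grouping size and function name are assumptions; only the file-name pattern comes from the text.

```python
# Hypothetical sketch: derive mergedNNNNN-NNNNN.root names from a run
# list. The naming pattern is from the text; the fixed group size is
# an assumed stand-in for however the merge script batches runs.
def merge_group_names(runs, group_size):
    """Chunk sorted runs into groups and name each merged output file."""
    runs = sorted(runs)
    names = []
    for i in range(0, len(runs), group_size):
        chunk = runs[i:i + group_size]
        # Zero-pad to five digits, matching mergedNNNNN-NNNNN.root.
        names.append(f"merged{chunk[0]:05d}-{chunk[-1]:05d}.root")
    return names
```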

Conversion to lifetime histograms with the grid engine at NPL

With the new grid engine at NPL, I have a new perl script, ~/jobscripts/ncsa_jobscript_createhistograms.pl, which looks for files of type mergedNNNNN-NNNNN.root in the source directory and launches CreateHistogramsFromTreeWithLoop.C for each of them. Before running it, you might again need to adapt some of the directory variables. To submit it to the grid engine, I use:

> qsub -q pion.2g -V CreateHistogram.sh prod_rn11_mu+_a_125 11;

Here,

  • pion.2g is the chosen grid engine queue
  • prod_rn11_mu+_a_125 is the name of the current data set to be processed
  • 11 is the PSI run number.
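The file discovery that ncsa_jobscript_createhistograms.pl performs can be sketched as a simple directory scan for the mergedNNNNN-NNNNN.root pattern. This is an illustrative stand-in, not the Perl script; the function name is invented, only the file-name pattern is from the text.

```python
# Sketch of the file discovery step: list the merged tree files that
# the histogram script would loop over. The name pattern
# mergedNNNNN-NNNNN.root is from the text; the rest is hypothetical.
import os
import re

MERGED_RE = re.compile(r"^merged(\d{5})-(\d{5})\.root$")


def find_merged_files(source_dir):
    """Return the sorted merged tree files found in source_dir."""
    return sorted(f for f in os.listdir(source_dir) if MERGED_RE.match(f))
```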

CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histograms with the axes:

  • x-axis: Time dt (electron-muon)
  • y-axis: Some parameter of your choice, such as gondola number, nContEH, or extraEL (depending on what you want to do, you need to create other histograms with the y-axis adapted to the parameter of interest)
  • z-axis: TPC subvolume number
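Conceptually, the fill logic for such a histogram looks like the following minimal stand-in for a ROOT TH3. This is not the macro's code: the class, its binning, and the example values are invented; only the axis meanings (x = decay time dt, y = a chosen parameter such as gondola number, z = TPC subvolume) follow the list above.

```python
# Minimal, hypothetical stand-in for a ROOT TH3 as used by the macro.
# Axis convention from the text: x = dt, y = parameter (e.g. gondola
# number), z = TPC subvolume. Binning details here are invented.
class Hist3:
    def __init__(self, nx, xlo, xhi, ny, nz):
        self.nx, self.xlo, self.xhi = nx, xlo, xhi
        self.ny, self.nz = ny, nz
        self.counts = {}  # sparse storage: (ix, iy, iz) -> count

    def fill(self, dt, ybin, zbin):
        """Increment the bin for decay time dt, parameter bin, subvolume."""
        ix = int((dt - self.xlo) / (self.xhi - self.xlo) * self.nx)
        if 0 <= ix < self.nx and 0 <= ybin < self.ny and 0 <= zbin < self.nz:
            key = (ix, ybin, zbin)
            self.counts[key] = self.counts.get(key, 0) + 1
```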

Fitting the data using the class TAllScans

See eLog:224 for a description of that. Once you have several fits from the TAllScans class in a directory, you can easily create an HTML overview page using the mu/src/uiuc/macros/LtFit/MakeOverviewHTML.pl macro. It simply takes as its argument the directory in which it looks for all subdirectories of name 'dataset_' and creates links to all the scans in these.
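What MakeOverviewHTML.pl does can be sketched roughly as below. This is an illustrative stand-in, not the Perl macro: it assumes the 'dataset_' name acts as a prefix, and the function name and HTML layout are invented.

```python
# Sketch of the overview-page generation: scan a directory for
# subdirectories whose names start with 'dataset_' (assumed prefix
# interpretation) and emit a simple index page linking to each.
import os


def make_overview_html(base_dir):
    """Return an HTML index page linking to each dataset_ subdirectory."""
    subdirs = sorted(
        d for d in os.listdir(base_dir)
        if d.startswith("dataset_") and os.path.isdir(os.path.join(base_dir, d))
    )
    links = "\n".join(f'<li><a href="{d}/">{d}</a></li>' for d in subdirs)
    return f"<html><body><ul>\n{links}\n</ul></body></html>"
```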
Please go to https://muon.npl.washington.edu/twiki/bin/view/Main/PeterWinterShrubProcessing