Difference: PeterWinterShrubProcessing (1 vs. 12)

Revision 12 2011-04-22 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
Changed:
<
<
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Subdividing the data sets into groups of same magnet setting

The first step is to separate the runs into data sets according to Sara's classification scheme. One important part of that is to sort them into corresponding groups of magnet settings (mainly +125A and -125A). That can be done from the MySQL database. The extracted data set files are at ~/jobscripts/lists/datasets. The files in there are named prod_rn10_mu+_a_125.txt or prod_rn11_mu-_c.txt and comprise the list of run numbers that belong to that data set.
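As a toy illustration (the directory, file name, and run numbers below are made-up placeholders, not taken from the actual database), such a dataset list is just a plain text file with one run number per line:

```shell
# Create a toy dataset list; the file name encodes the PSI run period,
# beam species, data set letter, and magnet setting.
mkdir -p datasets
printf '50001\n50002\n50003\n' > 'datasets/prod_rn11_mu+_a_125.txt'
# Downstream scripts simply read this list line by line:
wc -l < 'datasets/prod_rn11_mu+_a_125.txt'
```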

Creating the links

After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. The original trees usually live on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, where treepass2 must be replaced with the production that you are interested in (currently ProdMay09). Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl. It will make the links according to the run numbers that are listed in the current $datasetfile = $HOME."/jobscripts/lists/datasets/".$dataset.".txt"; . For example, to create the links for the run 11 data set a_125 for mu+, you have to run:

> ./MakeLinksForDatasets.pl 11 prod_rn11_mu+_a_125 Pixel_;

Here the argument 11 denotes that this is run 11. The second argument is the data set name, and the last states that you want to do this for the Pixel shrubs, i.e. the Pixel_treeNNNNN.root files. You might need to adapt some directory paths in the script to your needs.
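The link-creation step can be sketched roughly as follows; this is a minimal stand-in with toy paths and run numbers, not the actual MakeLinksForDatasets.pl (which also handles the run-number formatting and the real Abe directory layout):

```shell
# Symlink each Pixel_tree<run>.root listed in a run list into a
# per-dataset link directory (all paths here are toy placeholders).
srcdir="mta"
linkdir="links/prod_rn11_mu+_a_125"
mkdir -p "$srcdir" "$linkdir"
printf '50001\n50002\n' > runlist.txt
touch "$srcdir/Pixel_tree50001.root" "$srcdir/Pixel_tree50002.root"
while read -r run; do
  ln -sf "../../$srcdir/Pixel_tree${run}.root" "$linkdir/Pixel_tree${run}.root"
done < runlist.txt
ls "$linkdir"
```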

Merging the shrubs on the nodes and transferring to NPL

I have combined these steps into one macro. The main macro to be called at NCSA is ~/jobscripts/ncsa_jobscript_merge.pl. Before you run it, you might again need to adapt some of the directory variables in there, as well as the login at NPL for the scp command. After that, you can run it like:

> ./ncsa_jobscript_merge.pl prod_rn11_mu+_a_125 8 11 Pixel_;

Here, the first argument is the data set name and the second the number of processes run at a time. The third argument is the PSI run number, and the last gives the prefix for the ROOT file names. You can either submit this to the nodes or run it on an interactive machine, since the processing time is quite short. The merged tree files will then be copied over to NPL, provided that you specified the $remote_dir variable in the perl script.
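Schematically, the per-dataset work reduces to a merge followed by a copy. In the sketch below the commands are only echoed, and the use of ROOT's hadd, the merged file name, and the NPL login are my assumptions; the actual logic lives in ncsa_jobscript_merge.pl:

```shell
# Echo (rather than execute) the two essential commands: merge the
# per-run trees, then ship the merged file to NPL. Names are placeholders.
dataset="prod_rn11_mu+_a_125"
merged="merged50001-50200.root"
remote_dir="user@npl.example.edu:/data/$dataset"
echo hadd "$merged" Pixel_tree*.root
echo scp "$merged" "$remote_dir"
```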

Conversion to lifetime histograms with the grid engine at NPL

With the new grid engine on NPL, I have a new perl script in ~/jobscripts/ncsa_jobscript_createhistograms.pl which looks for files of the type mergedNNNNN-NNNNN.root in the source directory and launches CreateHistogramsFromTreeWithLoop.C for each of them. In order to run it, you might again need to adapt some of the directory variables first. To submit it to the grid engine, I use:

> qsub -q pion.2g -V CreateHistogram.sh prod_rn11_mu+_a_125 11;

Here,

  • pion.2g is the chosen grid engine queue
  • prod_rn11_mu+_a_125 is the name of the current data_set to be processed
  • 11 is the PSI run number.
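The CreateHistogram.sh wrapper itself is not shown on this page; a hypothetical minimal version (the macro signature and argument order are my assumptions) would simply forward the two arguments to ROOT:

```shell
# Hypothetical sketch of CreateHistogram.sh: pass the data set name and
# PSI run number on to the ROOT macro. The command is echoed rather than
# executed so the sketch runs without a ROOT installation.
create_histogram() {
  dataset="$1"
  psirun="$2"
  echo root -l -b -q "CreateHistogramsFromTreeWithLoop.C(\"$dataset\", $psirun)"
}
create_histogram 'prod_rn11_mu+_a_125' 11
```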

The CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histograms with the axes:

  • x-axis: Time dt (electron-muon)
  • y-axis: Some parameter of your choice like gondola number, nContEH, extraEL (Depending on what you want to do, you need to create other histograms with the y-axis being adapted to the parameter of interest.)
  • z-axis: TPC subvolume number

Fitting the data using the class TAllScans

See eLog:224 for the description of that. Once you have several fits from the TAllScans class in a directory, you can easily create an HTML overview page using the mu/src/uiuc/macros/LtFit/MakeOverviewHTML.pl macro. It simply takes as an argument the directory in which it looks for all subdirectories named 'dataset_' and creates links to all the scans in these.
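As a rough illustration of what such an overview generator does (directory names and the HTML layout below are invented; the real macro is MakeOverviewHTML.pl):

```shell
# Build a toy index.html that links every 'dataset_*' subdirectory.
mkdir -p scans/dataset_a scans/dataset_b
{
  echo '<html><body><ul>'
  for d in scans/dataset_*/; do
    echo "<li><a href=\"$d\">$d</a></li>"
  done
  echo '</ul></body></html>'
} > scans/index.html
grep -c '<li>' scans/index.html
```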
>
>
Please go to https://muon.npl.washington.edu/twiki/bin/view/Main/PeterWinterShrubProcessing
 \ No newline at end of file

Revision 11 2009-08-18 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009
Line: 38 to 38
 

Fitting the data using the class TAllScans

See eLog:224 for the description of that.
Added:
>
>
Once you have several fits from the TAllScans class in a directory, you can easily create a HTML overview page using the mu/src/uiuc/macros/LtFit/MakeOverviewHTML.pl macro. It simply takes the directory as an argument in which it looks for all subdirectories of name 'dataset_' and creates links to all the scans in these.

Revision 10 2009-07-23 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Subdividing the data sets into groups of same magnet setting

Changed:
<
<
First thing is to separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That can be done from the MySQL database. The extracted data set files are at ~/jobscripts/lists/datasets. The files in there are named prod_rn10_mu+_a_125 or prod_rn11_mu-_c and comprise a list of run numbers that belong to this data set.
>
>
First thing is to separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That can be done from the MySQL database. The extracted data set files are at ~/jobscripts/lists/datasets. The files in there are named prod_rn10_mu+_a_125.txt or prod_rn11_mu-_c.txt and comprise a list of run numbers that belong to this data set.
 

Creating the links

After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. The original trees usually live on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, where treepass2 must be replaced with the production that you are interested in (currently ProdMay09). Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl. It will make the links according to the run numbers that are listed in the current $datasetfile = $HOME."/jobscripts/lists/datasets/".$dataset.".txt"; . For example, to create the links for the run 11 data set a_125 for mu+, you have to run:
Changed:
<
<
> ./MakeLinksForDatasets.pl 11 prod_rn11_mu+_a_125 Pixel_;
>
>
> ./MakeLinksForDatasets.pl 11 prod_rn11_mu+_a_125 Pixel_;
Here the argument 11 denotes that this is run 11. The second argument is the data set name, and the last states that you want to do this for the Pixel shrubs, i.e. the Pixel_treeNNNNN.root files. You might need to adapt some directory paths in the script to your needs.

Merging the shrubs on the nodes and transferring to NPL

Changed:
<
<
I have combined these steps into one macro
>
>
I have combined these steps into one macro. The main macro to be called at NCSA is ~/jobscripts/ncsa_jobscript_merge.pl. Before you run it, you might again need to adapt some of the directory variables in there, as well as the login at NPL for the scp command. After that, you can run it like:

> ./ncsa_jobscript_merge.pl prod_rn11_mu+_a_125 8 11 Pixel_;

Here, the first argument is the data set name and the second the number of processes run at a time. The third argument is the PSI run number, and the last gives the prefix for the ROOT file names. You can either submit this to the nodes or run it on an interactive machine, since the processing time is quite short. The merged tree files will then be copied over to NPL, provided that you specified the $remote_dir variable in the perl script.

Conversion to lifetime histograms with the grid engine at NPL

With the new grid engine on NPL, I have a new perl script in ~/jobscripts/ncsa_jobscript_createhistograms.pl which looks for files of the type mergedNNNNN-NNNNN.root in the source directory and launches CreateHistogramsFromTreeWithLoop.C for each of them. In order to run it, you might again need to adapt some of the directory variables first. To submit it to the grid engine, I use:

> qsub -q pion.2g -V CreateHistogram.sh prod_rn11_mu+_a_125 11;

Here,

  • pion.2g is the chosen grid engine queue
  • prod_rn11_mu+_a_125 is the name of the current data_set to be processed
  • 11 is the PSI run number.
 
Changed:
<
<

Conversion to lifetime histograms

  1. In order to create histograms, I have a script on NPL at ~/jobscripts/MakeHistogramExe.pl. If you run it, it requires three input parameters:
    MakeHistogramExe.pl input_directory data_set_name prefix where
    • input_directory: Is the directory, where your shrub files are
    • data_set_name: It's the name of the current data_set
    • prefix: Is the beginning of your shrub file names before the run number, i.e. if your shrubs are Pixel_34536-36776.root, then prefix="Pixel_"
      If you run this script, it will loop over all the root files with this prefix and create a HistogramCreation_$data_set_name.pl file. You might need to adapt the script since it now expects shrub file names with a range of run numbers, i.e. Pixel_34536-36776.root rather than Pixel_34536.root! You should also edit the output directory in this MakeHistogramExe.pl to match your needs.
  2. You can now launch the HistCreate_$data_set_name.pl files from the step before. This script expects a single argument indicating which of the root files that were found in the step above to run, i.e. 0=first ROOT file, 1=second ROOT file etc.
    Launching can be done with a small condor job. Use the launch_condor example setup file from CVS and change necessary directories/paths in there to adapt to your needs.
    The HistCreate_$data_set_name.pl will then call a ROOT script with appropriate arguments. This CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histograms with the axes:
>
>
The CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histograms with the axes:
 
    • x-axis: Time dt (electron-muon)
    • y-axis: Some parameter of your choice like gondola number, nContEH, extraEL (Depending on what you want to do, you need to create other histograms with the y-axis being adapted to the parameter of interest.)
    • z-axis: TPC subvolume number

Fitting the data using the class TAllScans

Deleted:
<
<

Parameter scan for nContEH, nExtraEL

Gondola scan

 \ No newline at end of file
Added:
>
>
See eLog:224 for the description of that.

Revision 9 2009-07-23 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Subdividing the data sets into groups of same magnet setting

Changed:
<
<
First thing is to separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That is done with ~/jobscripts/ExtractMagnetForDataSets.pl at NPL.
>
>
First thing is to separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That can be done from the MySQL database. The extracted data set files are at ~/jobscripts/lists/datasets. The files in there are named prod_rn10_mu+_a_125 or prod_rn11_mu-_c and comprise a list of run numbers that belong to this data set.
 

Creating the links

Changed:
<
<
After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl. It will make the links according to the run numbers that are listed in the current $datasetfile = $HOME."/jobscripts/lists/datasets/".$dataset.".txt";
>
>
After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. The original trees live usually on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, where treepass2 must be replaced with the according production that you are interested (currently ProdMay09). Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl. It will make the links according to the run numbers that are listed in the current $datasetfile = $HOME."/jobscripts/lists/datasets/".$dataset.".txt"; . To for example create the links for run 10 data set a_125 for mu+, you have to run: > ./MakeLinksForDatasets.pl 11 prod_rn11_mu+_a_125 Pixel_;
Here the argument 11 denotes that this is run 11. The second argument is the data set name, and the last states that you want to do this for the Pixel shrubs, i.e. the Pixel_treeNNNNN.root files. You might need to adapt some directory paths in the script to your needs.
 
Changed:
<
<

Merging the shrubs on the nodes

Transfer to NPL

>
>

Merging the shrubs on the nodes and transferring to NPL

I have combined these steps into one macro
 

Conversion to lifetime histograms

  1. In order to create histograms, I have a script on NPL at ~/jobscripts/MakeHistogramExe.pl. If you run it, it requires three input parameters:
    MakeHistogramExe.pl input_directory data_set_name prefix where

Revision 8 2009-04-20 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009
Line: 15 to 15
 

Transfer to NPL

Conversion to lifetime histograms

Changed:
<
<
  1. In order to create histograms, I have a script on NPL at ~/jobscripts/MakeHistogramExe.pl. If you run it, it requires three input parameters:
MakeHistogramExe.pl input_directory data_set_name prefix where
>
>
  1. In order to create histograms, I have a script on NPL at ~/jobscripts/MakeHistogramExe.pl. If you run it, it requires three input parameters:
    MakeHistogramExe.pl input_directory data_set_name prefix where
 
    • input_directory: Is the directory, where your shrub files are
    • data_set_name: It's the name of the current data_set
Changed:
<
<
    • prefix: Is the beginning of your shrub file names before the run number, i.e. if your shrubs are Pixel_34536-36776.root, then prefix="Pixel_"
If you run this script, it will loop over all the root files with this prefix and create a HistogramCreation_$data_set_name.exe file. You might need to adapt the script since it now expects shrub file names with a range of run numbers, i.e. Pixel_34536-36776.root rather than Pixel_34536.root! You should also edit the output directory in this MakeHistogramExe.pl to match your needs.
  1. You can now launch the HistogramCreation_$data_set_name.exe files from the step before. It will simply call ROOT several times with the macro CreateHistogramsFromTreeWithLoop.C on the corresponding shrub files. The CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histograms with the axes:
>
>
    • prefix: Is the beginning of your shrub file names before the run number, i.e. if your shrubs are Pixel_34536-36776.root, then prefix="Pixel_"
      If you run this script, it will loop over all the root files with this prefix and create a HistogramCreation_$data_set_name.pl file. You might need to adapt the script since it now expects shrub file names with a range of run numbers, i.e. Pixel_34536-36776.root rather than Pixel_34536.root! You should also edit the output directory in this MakeHistogramExe.pl to match your needs.
  1. You can now launch the HistCreate_$data_set_name.pl files from the step before. This script expects a single argument indicating which of the root files that were found in the step above to run, i.e. 0=first ROOT file, 1=second ROOT file etc.
    Launching can be done with a small condor job. Use the launch_condor example setup file from CVS and change necessary directories/paths in there to adapt to your needs.
    The HistCreate_$data_set_name.pl will then call a ROOT script with appropriate arguments. This CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histogams with the axis:
 
    • x-axis: Time dt (electron-muon)
Changed:
<
<
    • y-axis: Some parameter of your choice like gondola number, nContEH, extraEL
>
>
    • y-axis: Some parameter of your choice like gondola number, nContEH, extraEL (Depending on what you want to do, you need to create other histograms with the y-axis being adapted to the parameter of interest.)
 
    • z-axis: TPC subvolume number
Deleted:
<
<
Depending on what you want to do, you need to create other histograms with the y-axis being adapted to the parameter of interest.

(Condor job later?)

 

Fitting the data using the class TAllScans

Added:
>
>

Parameter scan for nContEH, nExtraEL

Gondola scan

Revision 7 2009-04-16 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009
Line: 15 to 15
 

Transfer to NPL

Conversion to lifetime histograms

Changed:
<
<
  • ~/MakeHistogramExe.pl
  • ~/HistogramCreation.exe
>
>
  1. In order to create histograms, I have a script on NPL at ~/jobscripts/MakeHistogramExe.pl. If you run it, it requires three input parameters:
MakeHistogramExe.pl input_directory data_set_name prefix where
    • input_directory: Is the directory, where your shrub files are
    • data_set_name: It's the name of the current data_set
    • prefix: Is the beginning of your shrub file names before the run number, i.e. if your shrubs are Pixel_34536-36776.root, then prefix="Pixel_"
If you run this script, it will loop over all the root files with this prefix and create a HistogramCreation_$data_set_name.exe file. You might need to adapt the script since it now expects shrub file names with a range of run numbers, i.e. Pixel_34536-36776.root rather than Pixel_34536.root! You should also edit the output directory in this MakeHistogramExe.pl to match your needs.
  1. You can now launch the HistogramCreation_$data_set_name.exe files from the step before. It will simply call ROOT several times with the macro CreateHistogramsFromTreeWithLoop.C on the corresponding shrub files. The CreateHistogramsFromTreeWithLoop.C will mainly produce various TH3 histograms with the axes:
    • x-axis: Time dt (electron-muon)
    • y-axis: Some parameter of your choice like gondola number, nContEH, extraEL
    • z-axis: TPC subvolume number
Depending on what you want to do, you need to create other histograms with the y-axis being adapted to the parameter of interest.
  (Condor job later?)
Added:
>
>

Fitting the data using the class TAllScans

Revision 6 2009-04-16 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Subdividing the data sets into groups of same magnet setting

Changed:
<
<
First thing is separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That is done with ~/jobscripts/ExtractMagnetForDataSets.pl at NPL.
>
>
First thing is to separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That is done with ~/jobscripts/ExtractMagnetForDataSets.pl at NPL.
 

Creating the links

Changed:
<
<
After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl
>
>
After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl. It will make the links according to the run numbers that are listed in the current $datasetfile = $HOME."/jobscripts/lists/datasets/".$dataset.".txt";
 

Merging the shrubs on the nodes

Revision 5 2009-03-10 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Subdividing the data sets into groups of same magnet setting

Changed:
<
<
NPL: ~/ExtractMagnetForDataSets.pl
>
>
First thing is separate the runs into data sets according to Sara's classification scheme. One important step for that is to sort them into according groups of magnet settings (mainly +125A and -125A). That is done with ~/jobscripts/ExtractMagnetForDataSets.pl at NPL.
 

Creating the links

After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl

Revision 4 2009-02-18 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Subdividing the data sets into groups of same magnet setting

Added:
>
>
NPL: ~/ExtractMagnetForDataSets.pl
 

Creating the links

After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl
Line: 16 to 17
 

Conversion to lifetime histograms

Added:
>
>
  • ~/MakeHistogramExe.pl
  • ~/HistogramCreation.exe
  (Condor job later?) \ No newline at end of file

Revision 3 2009-02-18 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Added:
>
>

Subdividing the data sets into groups of same magnet setting

 

Creating the links

After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl

Revision 2 2009-02-16 - PeterWinter

Line: 1 to 1
 
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Creating the links

Changed:
<
<
>
>
After the mta has produced the intermediate shrubs, it might be necessary to create appropriate links to the data sets. Since Brendan had the original trees in his directories on Abe in /cfs/projects/lqo/bkiburg/run11_4/treepass2/mta/, I created data set directories in /cfs/scratch/projects/lqo/pwinter/run11_4/treepass2/. Then, to create the links for each of the data sets, I have a macro in /u/ac/pwinter/jobscripts called MakeLinksForDatasets.pl
 

Merging the shrubs on the nodes

Revision 1 2009-02-16 - PeterWinter

Line: 1 to 1
Added:
>
>
META TOPICPARENT name="PeterWinterAnalysis"
-- PeterWinter - 16 Feb 2009

This page keeps a description of how I create the final lifetime histograms from the output shrubs of ./mta....

Creating the links

Merging the shrubs on the nodes

Transfer to NPL

Conversion to lifetime histograms

(Condor job later?)

 