
This page describes how to use the singleLepton repository to run a single-lepton analysis with LJMet.

Checking out the code

Check out the LJMet repository following the instructions here:

LJMet Twiki

git clone [email protected]:cms-ljmet/Ljmet-singlelepton.git LJMet/singleLepton
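
For reference, a full checkout might look like the sketch below. The CMSSW release name is only a placeholder; the release area and the LJMet checkout itself should follow the LJMet Twiki instructions.

cd CMSSW_X_Y_Z/src   # placeholder release name; use the release from the LJMet Twiki
cmsenv
# LJMet should already be checked out here per the Twiki; then add singleLepton:
git clone [email protected]:cms-ljmet/Ljmet-singlelepton.git LJMet/singleLepton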

Running LJMet Jobs

Running LJMet jobs for singleLepton is done using the configuration files in singleLepton/condor. They are set up to run on either the LPC or Brux.

Jobs are submitted by running the following command:

python singleLeptonSubmit.py --useMC True --sample TpBW --fileList TprimeToBW_750_Madgraph.txt --submit True --local True

  • --useMC - specifies whether you are running on MC or data
  • --sample - specifies the job name and the output directory
  • --fileList - a text file listing the PAT-tuple files (and their locations) to run over. You can get this list from DAS by looking up the dataset and using the "py" output option; the full file list can be copied directly.
  • --submit - specifies whether to submit the jobs. Setting this to False creates the individual jobs but does not submit them to Condor (see the example after this list).
  • --local - if set to False, the input files are looked up via xrootd instead of locally
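
For example, to prepare a set of data jobs without submitting them right away (the sample name and file list below are purely illustrative), you could run:

python singleLeptonSubmit.py --useMC False --sample SingleMu_RunA --fileList SingleMu_RunA.txt --submit False --local True

Once the generated job files look correct, rerun with --submit True to send them to Condor.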

You can submit several jobs at once with the bash script included in the condor directory:

./submit true true true

The three booleans specify whether to submit the signal, background, and data blocks, respectively. Individual lines in the script can be commented out or uncommented as needed.
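
For example, to submit only the background blocks:

./submit false true false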

Merging files

There is also a bash script for merging the job outputs. It hadds the output files of each dataset, then applies a pileup weight and normalizes the cross section to 1/pb of luminosity, combining everything into a single weight, __WEIGHT__. The merged files are written to the sandbox directory.

./merge.sh
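
Roughly speaking, the merge step for each dataset behaves like the sketch below; the directory layout and file names are only illustrative, and the actual logic (including the pileup and cross-section weighting that fills __WEIGHT__) lives in merge.sh.

hadd -f sandbox/TpBW.root TpBW/*.root   # combine all condor job outputs for one dataset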
