ARATutorialCCIN2P3
Revision 10, 03 Sep 2008 - Main.FR

Goals of this tutorial

The ATLAS analysis data format is evolving. This tutorial aims to introduce, in a simplified and clear manner, the recently introduced data format: the Derived Physics Dataset (DPD).

Questions such as the following are addressed: what is a DPD, what does it contain, how can I create my own DPD, how can I run centrally maintained tools for DPD production and, finally, how can I analyse a DPD with both ATHENA and ROOT?

This tutorial is meant for ATHENA release 13.0.40 only (except maybe for the event generation and simulation parts); it will be updated accordingly for release 14.0.X. The release 13 version is also available at:
http://atlas-france.in2p3.fr/cgi-bin/twiki/bin/view/Atlas/ARATutorialCCIN2P3AthenaRelease13

Moreover, a FAQ section with recurring issues will be maintained.

Setting up your ATHENA session at CCIN2P3 for release 13.0.40

The following settings are described in more detail at this URL: GuideAthena twiki.

In all that follows, we assume that your default shell is bash (check your $SHELL environment variable). If you're using tcsh, adapt the shell commands accordingly.

Finally, in this tutorial, we use $HOME/athena/13.0.40 as ATLAS_TEST_AREA. If you have disk space issues and have a $GROUP_DIR/$LOGNAME area, replace the $HOME variable with $GROUP_DIR/$LOGNAME in all that follows.

Setting up ATHENA 13.0.40

The settings below are of course meant for people who have an account on the CCIN2P3 machines! For an ATHENA kit or lxplus, simply follow the usual explanations on how to set up your ATHENA environment on this twiki at CERN: WorkBookSetAccount.

# connect to a SLC3 32-bit node
ssh -X ccalisl3.in2p3.fr

# you can connect to the SLC4 32-bit node as well, but then below, you need to make sure that you use the slc4 tag!!!!!
ssh -X ccalisl432.in2p3.fr

# create your ATHENA working area
mkdir -p $HOME/athena/13.0.40/cmt
cd $HOME/athena/13.0.40/cmt

# copy this requirement file
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/requirements .

# source this script only once!
source /afs/in2p3.fr/group/atlas/sw/prod/releases/rel_13-4/CMT/v1r20p20070720/mgr/setup.sh

# configure and produce the setup scripts (only once!)
cmt config

# each time you login and want to setup your ATHENA 13.0.40 with AtlasProduction-13.0.40.3  release, do:
source ~/athena/13.0.40/cmt/setup.sh -tag=13.0.40.3,32,AtlasProduction,slc3

#if you connected to an SLC4 32 bit node, use slc4 as tag!!!!
source ~/athena/13.0.40/cmt/setup.sh -tag=13.0.40.3,32,AtlasProduction,slc4

# you can check that things are consistent by printing the CMTPATH variable. For instance, user ghodban should see:
echo $CMTPATH
/afs/in2p3.fr/home/g/ghodban/athena/13.0.40:/afs/in2p3.fr/group/atlas/sw/prod/releases/rel_13-4/AtlasProduction/13.0.40.3

Setting up the ATLAS CVS repository

We will need to check out some packages. CCIN2P3 hosts a CVS mirror of the ATLAS software. Here are the basic settings required to get it working.
Create or append to your .ssh/config file (read by ssh for configuration):
   
Host anoncvs.in2p3.fr
User atlascvs
Port 2222    
PubkeyAuthentication no
RSAAuthentication no
PasswordAuthentication yes
ForwardX11 no
You can get this file from this location:
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/config ~/.ssh/.

Don't forget to set the following environment variables to get CVS working (use setenv if your $SHELL is tcsh):

export CVSROOT=atlascvs@anoncvs.in2p3.fr:/atlascvs
export CVS_RSH=ssh
export CMTCVSOFFSET=offline

Now, you should be able to check out packages.
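Before checking anything out, it can save time to verify that the three variables above are set correctly. A minimal sketch in Python (a hypothetical helper, not part of any ATLAS tool):

```python
import os

# Hypothetical helper (not part of the ATLAS software): checks that the
# environment variables required for the CCIN2P3 CVS mirror are set
# before attempting a checkout.
REQUIRED = {
    "CVSROOT": "atlascvs@anoncvs.in2p3.fr:/atlascvs",
    "CVS_RSH": "ssh",
    "CMTCVSOFFSET": "offline",
}

def missing_cvs_settings(env=os.environ):
    """Return the variables that are unset or have an unexpected value."""
    return [name for name, expected in REQUIRED.items()
            if env.get(name) != expected]
```

If the returned list is non-empty, export the listed variables before running cmt co.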

Producing an Mcatnlo Atlfast AOD

Before going further, we are going to spend some time showing how to use the Mcatnlo program for event generation at 10 TeV. This is done in three steps:

Installing Mcatnlo 3.2

First, install the Mcatnlo event generator. For disk space reasons, we assume that you've got a $GROUP_DIR area:
cd $GROUP_DIR/$LOGNAME

In this directory, we download version 3.2 of the Mcatnlo program.

mkdir mcatnlo
cd mcatnlo
# get the 3.2 version
wget http://www.hep.phy.cam.ac.uk/theory/webber/MCatNLO/Package32.tar.gz

# untar
tar zxvf Package32.tar.gz

In order to use Mcatnlo with the Les Houches Accord PDF library (LHAPDF), we need to:

  • make some changes to the MCatNLO.input file to tell the installation script where the LHAPDF distributed with the ATLAS 13.0.40 kit is located, where the HERWIG include files are, etc.

  • make some changes to the Makefile, mcatnlo_uti.f and mcatnlo_lhauti.f files so that we avoid clashes between Fortran function names

Simply copy the files for which these issues are fixed:

cd $GROUP_DIR/$LOGNAME/mcatnlo
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/mcatnlo32/* .
Now you're ready to compile and then run Mcatnlo:
./MCatNLO.input
Running this script automatically creates a directory called Linux (the name can differ if you're using another operating system) in which you will find the ttbNLO_EXE_LHAPDF executable ready for use.

Producing ttbar events with Mcatnlo 3.2

Let's run it. To do this, create an input parameter file like:


 'mc13.005200'       ! prefix for BASES files
 'mc13.005200'       ! prefix for event files
 10000 1 1 1 1 ! energy, fren, ffact, frenmc, ffactmc
 -1706                          ! -1705/1706=bb/tt
 173                        ! M_Q
 0.32 0.32 0.5 1.55 4.95 0.75 ! quark and gluon masses
 'P'  'P'               ! hadron types
 'LHAPDF'   10000            ! PDF group and id number
  -1                     ! Lambda_5, <0 for default
 'MS'                   ! scheme
  10000                          ! number of events
  1                        ! 0 => wgt=+1/-1, 1 => wgt=+w/-w
  0                      ! seed for rnd numbers
  0.3                             ! zi
 10 10                 ! itmx1,itmx2

Then simply run:
cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux

ttbNLO_EXE_LHAPDF < ../mc13.005200.input

This will produce an event file mc13.005200.events ready to be used with ATHENA. The preparation of this file is described in the next section.

Producing Generator Pool Files (Parton-shower/Hadronization/Photos/Tauola/Jimmy)

The Mcatnlo event file mc13.005200.events contains events ready to be processed by Herwig for parton shower and hadronisation and by Jimmy for the underlying event. It is rather important to stick to official ATLAS tools.
Let's prepare the Mcatnlo event file first. We need to change the PDF group name to HWLHAPDF, the name expected by the Mcatnlo_i interface:

cd  $GROUP_DIR/$LOGNAME/mcatnlo/Linux
sed -i 's/LHAPDF    10000/HWLHAPDF  10000/' mc13.005200.events

Next, we need to prepare the Mcatnlo_i input parameter file:

 'mc13.005200.events'        ! event file
  1000                         ! number of events
  1                        ! 0->Herwig PDFs, 1 otherwise
 'P'  'P'               ! hadron types
  5000.00 5000.00               ! beam momenta
  -1706                          ! -1705/1706=bb/tt
 'HWLHAPDF'                      ! PDF group (1). It was MRS in the original ttMCinput file
  10000                       ! PDF id number (1). It was 105 in the original ttMCinput file
 'HWLHAPDF'                      ! PDF group (2). It was MRS in the original ttMCinput file
  10000                       ! PDF id number (2). It was 105 in the original ttMCinput file
  -1                     ! Lambda_5, < 0 for default
  173                        ! M_Q
  0.32 0.32 0.5 1.55 4.95 0.75 ! quark and gluon masses

You can copy it from this location:
cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/mcatnlo32/mc13.005200.dat .

We then need to put the two files mc13.005200.dat and mc13.005200.events in an archive mc13.005200.tar.gz used by ATHENA.

cd  $GROUP_DIR/$LOGNAME/mcatnlo/Linux
tar zcvf mc13.005200.tar.gz mc13.005200.events mc13.005200.dat

We need to check out this tag, for which a fix is provided so that one can run at 10 TeV centre-of-mass energy:

cd ~/athena/13.0.40
cmt co -r EvgenJobTransforms-00-06-04 Generators/EvgenJobTransforms
cd Generators/EvgenJobTransforms/*/cmt
source setup.sh
gmake

Let's prepare the input file for EvgenJobTransforms. We first need to prepare the generic file needed by the ATHENA McAtNLO_i interface: get the ATHENA job option file which was used for the CSC production.
cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux

# retrieve the CSC job option file and fix the input file name prefix (hard-coded)
get_files  CSC.005200.T1_McAtNlo_Jimmy.py
sed -i s"/mcatnlo31.005200.ttbar/mc13.005200/" CSC.005200.T1_McAtNlo_Jimmy.py

Then run it:

csc_evgen08_trf.py -t -l INFO runNumber=00001 firstEvent=1 maxEvents=1000 randomSeed=10000 jobConfig=CSC.005200.T1_McAtNlo_Jimmy.py outputEvgenFile=mc13.005200.gen.pool.root  inputGeneratorFile=mc13.005200.tar.gz  &>mc13.005200.log.1&

This should produce a generator POOL file, that one can then use for full simulation or fast simulation with Atlfast. This step is described in the next section.
Note that you can already check the contents of this generator POOL file by using the checkFile.py facility:

cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux
checkFile.py mc13.005200.gen.pool.root &> pool.contents

## opening file [mc13.005200.gen.pool.root]...
## importing ROOT...
## importing ROOT... [DONE]
## opening file [OK]
File:mc13.005200.gen.pool.root
Size:    37173.304 kb
Nbr Events: 1115

================================================================================
     Mem Size       Disk Size        Size/Evt      items  (X) Container Name (X=Tree|Branch)
================================================================================
     796.464 kb       31.687 kb        0.028 kb     1115  (T) DataHeader
--------------------------------------------------------------------------------
     288.518 kb        7.508 kb        0.007 kb     1115  (B) EventInfo_p2_McEventInfo
   96966.967 kb    36760.912 kb       32.969 kb     1115  (B) McEventCollection_p3_GEN_EVENT
================================================================================
   98051.949 kb    36800.107 kb       33.005 kb     1115  TOTAL (POOL containers)
================================================================================
## Bye.
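The checkFile.py table above is plain text and easy to post-process. The sketch below (purely illustrative, not an official utility) parses the container lines of such output and computes each container's share of the total disk size:

```python
import re

# Sketch: parse the "(T)"/"(B)" container lines from checkFile.py output.
# Each line looks like:
#   "  96966.967 kb   36760.912 kb   32.969 kb   1115  (B) McEventCollection_p3_GEN_EVENT"
LINE_RE = re.compile(
    r"\s*([\d.]+) kb\s+([\d.]+) kb\s+([\d.]+) kb\s+(\d+)\s+\((T|B)\)\s+(\S+)")

def parse_containers(text):
    """Return {container name: disk size in kb} for every container line."""
    sizes = {}
    for line in text.splitlines():
        m = LINE_RE.match(line)
        if m:
            sizes[m.group(6)] = float(m.group(2))
    return sizes

def disk_fractions(sizes):
    """Fraction of the total disk size taken by each container."""
    total = sum(sizes.values())
    return {name: kb / total for name, kb in sizes.items()}
```

Applied to the output above, such a script makes it obvious that McEventCollection_p3_GEN_EVENT dominates the file size.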

Producing Atlfast AODs

The production of Atlfast AODs is rather easy and straightforward when using csc_atlfast_trf.py:
csc_atlfast_trf.py ntupleFile=atlfast.root maxEvents=1000 skipEvents=0 outputAODFile=mc13.005200.atlfast.pool.root  inputEvgenFile=mc13.005200.gen.pool.root  &>mc13.005200.log.2&
We can check the contents of this Atlfast AOD, for instance to get familiar with the different container names. To do this, simply use the checkFile.py script.

## opening file [mc13.005200.atlfast.pool.root]...
## importing ROOT...
## importing ROOT... [DONE]
## opening file [OK]
File:mc13.005200.atlfast.pool.root
Size:    36793.779 kb
Nbr Events: 1000

================================================================================
     Mem Size       Disk Size        Size/Evt      items  (X) Container Name (X=Tree|Branch)
================================================================================
    2916.702 kb      114.783 kb        0.115 kb     1000  (T) DataHeader
--------------------------------------------------------------------------------
     335.096 kb        0.000 kb        0.000 kb     1000  (B) ElectronContainer_p1_AtlfastElectronCollection
     273.462 kb        0.000 kb        0.000 kb     1000  (B) PhotonContainer_p1_AtlfastPhotonCollection
     292.480 kb        0.000 kb        0.000 kb     1000  (B) TauJetContainer_p1_AtlfastTauJetContainer
     605.093 kb        0.732 kb        0.001 kb     1000  (B) MuonContainer_p1_AtlfastNonIsoMuonCollection
      68.544 kb        1.343 kb        0.001 kb     1000  (B) TruthParticleContainer_p5_SpclMC
     652.330 kb        1.511 kb        0.002 kb     1000  (B) MuonContainer_p1_AtlfastMuonCollection
      70.099 kb        4.581 kb        0.005 kb     1000  (B) MissingET_p1_AtlfastMissingEt
     256.088 kb        7.508 kb        0.008 kb     1000  (B) EventInfo_p2_McEventInfo
     528.524 kb       64.473 kb        0.064 kb     1000  (B) TauJetContainer_p1_AtlfastTauJet1p3pContainer
    2130.056 kb      313.702 kb        0.314 kb     1000  (B) INav4MomAssocs_p2_AtlfastMcAodAssocs
    4189.289 kb      360.320 kb        0.360 kb     1000  (B) ParticleJetContainer_p1_Cone4TruthParticleJets
    4209.423 kb      367.841 kb        0.368 kb     1000  (B) ParticleJetContainer_p1_Kt4TruthParticleJets
    4423.054 kb      380.327 kb        0.380 kb     1000  (B) ParticleJetContainer_p1_Cone7TruthParticleJets
    4608.956 kb      385.483 kb        0.385 kb     1000  (B) ParticleJetContainer_p1_Kt6TruthParticleJets
    2627.185 kb      592.927 kb        0.593 kb     1000  (B) ParticleJetContainer_p1_AtlfastParticleJetContainer
   19454.810 kb     7827.819 kb        7.828 kb     1000  (B) Rec::TrackParticleContainer_tlp1_AtlfastTrackParticles
   61318.476 kb    23965.933 kb       23.966 kb     1000  (B) McEventCollection_p3_GEN_AOD
================================================================================
  108959.667 kb    34389.283 kb       34.389 kb     1000  TOTAL (POOL containers)
================================================================================
## Bye.

Now we are ready to analyse this AOD and produce DPDs. This is the purpose of the next sections.

Producing DPDs from AODs

Some definitions

The current Event Data Model introduces a new data format, the Derived Physics Dataset (DPD), with the aim of further reducing the size of the analysis objects. A DPD is thus defined as a set of data which is a subset of the ESD or AOD content, with the possible addition of analysis data, analysis data being defined as quantities derived from data in the ESD or AOD.
Reducing the size of an event is done in three steps:


* Skimming: selecting only interesting events based on some event-level quantity, e.g.:

  • number of electrons

  • missing ET

Skimming can be implemented by cutting directly on TAG files.


* Thinning: selecting only interesting objects from a container, e.g.:

  • keep all electrons with pT > 20 GeV


* Slimming: selecting only interesting properties of an object, e.g.:

  • drop some of the calorimeter information from an electron
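These three steps can be illustrated on a toy event model; the sketch below uses plain Python dictionaries as a stand-in for the ATHENA EDM (purely illustrative, not real ATHENA code):

```python
# Toy illustration of the three DPD reduction steps on plain Python data
# (a hypothetical event model, not the real ATHENA EDM).

def skim(events, min_electrons=1):
    """Skimming: keep only interesting *events* (here: >= 1 electron)."""
    return [e for e in events if len(e["electrons"]) >= min_electrons]

def thin(event, min_pt=20.0):
    """Thinning: keep only interesting *objects* (here: electrons with pT > 20 GeV)."""
    event = dict(event)
    event["electrons"] = [el for el in event["electrons"] if el["pt"] > min_pt]
    return event

def slim(event, keep=("pt", "eta")):
    """Slimming: keep only interesting *properties* of each object."""
    event = dict(event)
    event["electrons"] = [{k: el[k] for k in keep} for el in event["electrons"]]
    return event
```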

Some useful acronyms that you need to be familiar with:

* A D1PD is defined as a DPD produced centrally from an AOD or a set of AODs using the working-group DPD maker tool, e.g. TopPhysDPDMaker for the top working group

* A D2PD is defined as a privately made or customised DPD produced from a D1PD or an AOD

* A D3PD is defined as an ntuple made from a D1PD, a D2PD or an AOD. This is somewhat similar to the EventView/TopView approach.

This tutorial will focus mainly on the production of D1PD and D2PD. D3PDs are not considered here.

Finally, if you want to follow the evolution of the current scheme, you need to keep an eye on the following packages:

  • PhysicsAnalysis/AthenaROOTAccess

  • PhysicsAnalysis/AthenaROOTAccessExamples

  • PhysicsAnalysis/DPDUtils

and watch these two hypernews:

If you're interested in the recommendations of the Analysis Model Report, you can read it as a PDF:

Example 1: Produce a DPD from an AOD

Previously we have seen how to produce an Atlfast AOD; we would now like to produce a DPD from it. There is not much to know to do this: you simply need to define an ATHENA POOL output stream (AthenaPoolOutputStream) that you can call StreamDPD, and specify which items it should contain by adding them to the AthenaPoolOutputStream ItemList.

The script looks like:

#------------------------------------------------------------------------
# import the data types
import EventKernel.ParticleDataType

#------------------------------------------------------------------------
# get a handle on the ServiceManager which holds all the services
from AthenaCommon.AppMgr import ServiceMgr

#------------------------------------------------------------------------
# Particle Properties
from PartPropSvc.PartPropSvcConf import PartPropSvc

#------------------------------------------------------------------------
# the Converters
import AthenaPoolCnvSvc.ReadAthenaPool

include ( "ParticleBuilderOptions/ESD_PoolCnv_jobOptions.py" )
include ( "ParticleBuilderOptions/AOD_PoolCnv_jobOptions.py" )
include ( "ParticleBuilderOptions/McAOD_PoolCnv_jobOptions.py" )
include ( "EventAthenaPool/EventAthenaPool_joboptions.py" )

#------------------------------------------------------------------------
# our DPD contents will be defined here below
from AthenaPoolCnvSvc.WriteAthenaPool import AthenaPoolOutputStream
StreamDPD = AthenaPoolOutputStream( "StreamDPD" )

# We want first to store everything which is an EventInfo object
StreamDPD.ItemList  =  ['EventInfo#*']

# We want to store the muons smeared by the Atlfast simulation
StreamDPD.ItemList += ['Analysis::MuonContainer#AtlfastMuonCollection']

#- we want to keep electrons as well
StreamDPD.ItemList += ["ElectronContainer#AtlfastElectronCollection"]

#- and of course the jets (Atlfast jets use by default a Cone algorithm with R=0.4)
StreamDPD.ItemList += ['ParticleJetContainer#AtlfastParticleJetContainer']

#------------------------------------------------------------------------
StreamDPD.ForceRead= TRUE
StreamDPD.OutputFile= 'DPD.pool.root'

#------------------------------------------------------------------------
ServiceMgr.MessageSvc = Service( "MessageSvc" )
ServiceMgr.MessageSvc.OutputLevel = INFO
ServiceMgr.MessageSvc.Format = "% F%75W%S%7W%R%T %0W%M"
ServiceMgr.MessageSvc.defaultLimit=10000000

#------------------------------------------------------------------------
ServiceMgr.EventSelector.InputCollections = ['rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/mc13.005200.10TeV.atlfast.AOD._00200.v13004002.pool.root']

#------------------------------------------------------------------------
# Number of Events to process
theApp.EvtMax = -1
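Each ItemList entry above follows the 'ContainerType#StoreGateKey' convention, where a key of '*' selects every container of that type. A small sketch of this parsing (illustrative only; ATHENA does it internally):

```python
def parse_item(entry):
    """Split an ItemList entry 'Type#Key' into (container type, StoreGate key).

    A key of '*' means: every container of that type (e.g. 'EventInfo#*').
    Purely illustrative; not an ATHENA API.
    """
    ctype, sep, key = entry.partition("#")
    if not sep or not ctype or not key:
        raise ValueError("ItemList entries must look like 'Type#Key': %r" % entry)
    return ctype, key

# the entries used in the job options above
item_list = ['EventInfo#*',
             'Analysis::MuonContainer#AtlfastMuonCollection',
             'ParticleJetContainer#AtlfastParticleJetContainer']
```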

Test: copy this script and execute it to produce a DPD from the Atlfast AOD


# if not already created (or another directory where you've got enough disk space)
cd $GROUP_DIR/$LOGNAME
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example1_jobOptions.py .
athena Example1_jobOptions.py &>example1.log

Example 2: Produce a DPD with a filter from an AOD

In this example, we want to do exactly the same thing as in the previous section, but this time we would like to write to the DPD only events which fulfil the following conditions:

* at least one lepton,

* at least four jets,

* a missing transverse energy above 20 GeV.

The implementation of this filter is straightforward, as shown below. We define an algorithm called myFilter which inherits from PyAlgorithm. This algorithm has three methods: initialize, called once at the beginning; finalize, called at the end; and execute, called for each event. In the execute method, we retrieve from StoreGate the electron, muon and jet containers, as well as the missing transverse energy, using the PyParticleTools methods. Once these containers are loaded, we can either loop over the particles or simply, as done in this example, count them.

from gaudimodule import PyAlgorithm
import PyParticleTools.PyParticleTools as PyParticleTools
import PyAnalysisCore.PyEventTools as PyEventTools

#---------------------------------------------------------------------------
class myFilter( PyAlgorithm ):
#---------------------------------------------------------------------------
    def __init__ ( self, name ) :
        PyAlgorithm.__init__(self,name)
#---------------------------------------------------------------------------
    def initialize(self):
        print "Initializing myFilter"
        return True
#---------------------------------------------------------------------------
    def finalize(self):
        return True            
#---------------------------------------------------------------------------
    def execute(self):
        # we assume this is a good event
        self.setFilterPassed(True)
        # retrieve the lepton containers
        ElectronContainer  = PyParticleTools.getElectrons("AtlfastElectronCollection")
        MuonContainer      = PyParticleTools.getMuons("AtlfastMuonCollection")
        # require at least one lepton
        if ElectronContainer.size() + MuonContainer.size() < 1:
            self.setFilterPassed(False)

        # retrieve jets
        ParticleJetContainer = PyParticleTools.getParticleJets("AtlfastParticleJetContainer")
        # require at least four jets
        if ParticleJetContainer.size() < 4:
            self.setFilterPassed(False)
        # retrieve the missing ET and require it to be above 20 GeV
        MissingET        = PyParticleTools.getMissingET("AtlfastMissingEt")
        et = MissingET.et()
        if et < 20000:
            self.setFilterPassed(False)
        if self.filterPassed():
            print "Filter passed: a good event!"
        else:
            print "Filter failed: a bad event!"
        return True
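The selection applied by myFilter can also be exercised outside ATHENA; the sketch below re-implements the same cuts on toy containers (plain lists stand in for the StoreGate containers, for illustration only):

```python
# Toy re-implementation of the myFilter selection logic (illustrative only;
# plain lists stand in for the StoreGate containers).
def passes_filter(electrons, muons, jets, missing_et):
    """Same cuts as myFilter: >= 1 lepton, >= 4 jets, MET > 20 GeV."""
    if len(electrons) + len(muons) < 1:
        return False
    if len(jets) < 4:
        return False
    if missing_et < 20000.0:   # MeV, as in the ATHENA example
        return False
    return True
```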

Of course this filter can be extended to select particles in a given eta range and with a pT threshold. To know all the methods that one can use for electrons, muons and jets, have a look at these ATHENA classes:

* Analysis::Electron

* Analysis::Muon

* ParticleJet

This filter is then attached to the StreamDPD by adding to the Example1_jobOptions.py script the following lines:

include( "PyAnalysisCore/InitPyAnalysisCore.py")
include("Example2_filter.py")

myFilter= myFilter("myFilter")
theApp.TopAlg += ["myFilter"]
StreamDPD.AcceptAlgs=["myFilter"]
We first initialise the Python-based analysis core software (required; otherwise you will not be able to access the containers via StoreGate). Then we include our filter file, Example2_filter.py, and instantiate a filter algorithm that we call myFilter. We must not forget to add it to the algorithm sequence theApp.TopAlg; finally, we attach this algorithm to the StreamDPD AthenaPoolOutputStream, which will write an event only if myFilter passed (myFilter::filterPassed()).

Test: copy the Example2_jobOptions.py script and execute it to produce a DPD from the Atlfast AOD

cd $GROUP_DIR/$LOGNAME
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example2_filter.py .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example2_jobOptions.py .
athena Example2_jobOptions.py &>example2.log

At this level, you might want to improve the above filter. If you're not familiar with the methods used to access particle properties, have a look at the self-explanatory examples given in the PhysicsAnalysis/DPDUtils package.

Producing a private D2PD from this DPD

From the DPD.pool.root that you just produced, produce a new DPD using Example1_jobOptions.py. Don't forget to set ServiceMgr.EventSelector.InputCollections and StreamDPD.OutputFile accordingly.

Producing a D1PD using the TopPhysDPDMaker

Brief description of TopPhysDPDMaker

A set of tools is currently being developed for DPD creation. Most, if not all, of them are inspired by the various examples that one can find in the PhysicsAnalysis/DPDUtils package.
For the Top Working Group, a D1PD, D2PD and D3PD tool was recently released and is supported from release 13.0.40 upwards: TopPhysDPDMaker. Before going further, let's look at what this tool does and how to run it.
Let's first compile it:
cd $HOME/athena/13.0.40
cmt co -r TopPhysDPDMaker-00-00-11 PhysicsAnalysis/TopPhys/TopPhysDPDMaker
cd PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/cmt
source setup.sh
make

Let's look at its structure just to understand what is behind it.

ls $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/share
 
CVS                         D1PDSlimming_jobOptions.py     ElectroweakD2PD_topOptions.py  MetaData_jobOptions.py    TopPhysDPDSubmission
D1PDItemList_jobOptions.py  ElectroweakD1PD_topOptions.py  ElectroweakD3PD_topOptions.py  TopDPD_ProdOptionsAOD.py  TriggerObjectItemList_jobOptions.py

The different files in the TopPhysDPDMaker/share directory are:

* TopPhysDPDSubmission: a script for GRID submission using PATHENA

* TopDPD_ProdOptionsAOD.py: a D1PD production job option

* ElectroweakD1PD_topOptions.py: the D1PD production script. Let's edit it to get familiar with it. First, you can see this line:

from TopPhysDPDMaker.inclusive_lepFilterAlgorithm import *
We import inclusive_lepFilterAlgorithm, defined in the file python/inclusive_lepFilterAlgorithm.py and inspired by what one can find in DPDUtils/share/semilep_ttbarFilterAlgorithm.py.
Looking at this inclusive_lepFilterAlgorithm.py file, we see that the inclusive lepton filter is passed if the event contains at least one lepton (electron or muon) within an eta and pT range (both the MuidMuonCollection and the StacoMuonCollection are treated on the same footing).

Next, we see in the ElectroweakD1PD_topOptions.py, this line:
from TopPhysDPDMaker.slimJets import *
Again, here we import the python/slimJets.py script (algorithm), with which the jet collections Cone4H1TowerParticleJets, Cone4H1TopoParticleJets and Kt6H1TowerParticleJets are slimmed using the ParticleJet::removeAll() and ParticleJet::removeInfo() methods.

Then, we call the slimTracks algorithm (implemented in python/slimTracks.py):
from TopPhysDPDMaker.slimTracks import *
to remove from the TrackParticleCandidate container all tracks with pT < 5 GeV.
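The effect of this track slimming can be sketched on toy data (tracks represented only by their pT in GeV; the real slimTracks algorithm operates on Rec::TrackParticle objects):

```python
# Toy sketch of pT-based track slimming (illustrative only; the real
# slimTracks algorithm works on Rec::TrackParticle objects).
def slim_tracks(track_pts_gev, min_pt=5.0):
    """Drop all tracks below the pT threshold, as slimTracks does for pT < 5 GeV."""
    return [pt for pt in track_pts_gev if pt >= min_pt]
```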

Then, we see the creation of the DPD with the constraint that the inclusive_lepFilter is fulfilled.
StreamDPD = AthenaPoolOutputStream( "StreamDPD" )
StreamDPD.AcceptAlgs=["inclusive_lepFilter"]

The D1PD item list is defined by including:

include("TopPhysDPDMaker/D1PDItemList_jobOptions.py")
Let's have a look at it and see what is stored in the produced D1PD:
# Items for primary DPD
StreamDPD.ItemList =  ['EventInfo#*']
StreamDPD.ItemList += ['Rec::TrackParticleContainer#TrackParticleCandidate']
StreamDPD.ItemList += ['VxContainer#VxPrimaryCandidate']
StreamDPD.ItemList += ['ParticleJetContainer#Kt6H1TowerParticleJets']
StreamDPD.ItemList += ['ParticleJetContainer#Cone4H1TowerParticleJets']
StreamDPD.ItemList += ['ParticleJetContainer#Cone4H1TopoParticleJets']
StreamDPD.ItemList += ['egammaContainer#ElectronAODCollection']
StreamDPD.ItemList += ['egammaContainer#PhotonAODCollection']
StreamDPD.ItemList += ['egDetailContainer#egDetailAOD']
StreamDPD.ItemList += ['Analysis::TauJetContainer#*']
StreamDPD.ItemList += ['Analysis::TauDetailsContainer#*']
StreamDPD.ItemList += ['Rec::TrackParticleContainer#MuonboyMuonSpectroOnlyTrackParticles']
StreamDPD.ItemList += ['Rec::TrackParticleContainer#StacoTrackParticles']
StreamDPD.ItemList += ['MissingET#MET_RefFinal']
StreamDPD.ItemList += ['Analysis::MuonContainer#StacoMuonCollection']
StreamDPD.ItemList += ['Analysis::MuonContainer#MuidMuonCollection']
StreamDPD.ItemList += ['Analysis::MuonContainer#CaloMuonCollection']
StreamDPD.ItemList += ['Rec::MuonSpShowerContainer#*']


  • ElectroweakD2PD_topOptions.py: job script for producing a D2PD; so far it is only a copy of ElectroweakD1PD_topOptions.py, and this is the script that we will use to produce our D2PD

  • ElectroweakD3PD_topOptions.py: job script for producing a D3PD (ntuples a la TopView) with analysis objects (not considered here!).

In order to produce a D3PD, i.e. an EventView/TopView ntuple, you need to check out these packages:
cmt co -r TopPhysTools-13-00-40-07  PhysicsAnalysis/TopPhys/TopPhysTools
cmt co -r EventViewUserData-00-01-18-04  PhysicsAnalysis/EventViewBuilder/EventViewUserData
cmt co -r HighPtView-00-01-11 PhysicsAnalysis/HighPtPhys/HighPtView
cmt co -r HEAD PhysicsAnalysis/TopPhys/TopPhysUtils/HitFit
and follow the explanations given on this Twiki for setting up the group area

Producing a D2PD with TopPhysDPDMaker

To create a D2PD using the TopPhysDPDMaker tool, you need to use the share/ElectroweakD2PD_topOptions.py script. To do this, create a job option file with the following commands:

# AOD that you want to run over
InputCollections = ['rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/trig1_misal1_mc12.005200.T1_McAtNlo_Jimmy.recon.AOD.v13003004_tid018416._00002.pool.root.1']

# maximum number of events to process (this is not the number of events to be written to the DPD!)
EvtMax=10

include("TopPhysDPDMaker/ElectroweakD2PD_topOptions.py")

If you run this script, it will crash with the following error message:

IOVDbMgr            ERROR Unable to get default connection to COOL Conditions database.
IOVDbMgr            ERROR    Please set job option: IOVDbSvc.dbConnection  = <db connection string>
IOVDbSvc            ERROR Unable to get dbConnection - empty connection string 
ServiceManager      ERROR Unable to initialize Service: DetectorStore

This is consistent with what was reported on the hypernews forums. As explained on the TopPhysDPDMaker twiki, release 13 is a transition release since:

  • 13.0.30 -> no dB, but the trigger configuration is present event-by-event in the AOD

  • fdr -> the dB exists, plus we have the DataStore metaheader

  • 13.0.40 is a sort of transition: there's no dB, no AOD info, no DataStore

The suggested solution is simply to drop the trigger part, as is suggested for the D3PD case. To do this for the D2PD case, you need to edit TopPhysDPDMaker/*/share/MetaData_jobOptions.py and comment out these lines:

# Metadata Info for Trigger Configuration
# import IOVDbSvc.IOVDb
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/HLT/Menu" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/HLT/HltConfigKeys" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/LVL1/Lvl1ConfigKey" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/LVL1/Menu" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/LVL1/Prescales" ]

# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/HLT/Menu" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/HLT/HltConfigKeys" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/LVL1/Lvl1ConfigKey" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/LVL1/Menu" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/LVL1/Prescales" ]

Then you will be able to run the D2PD production tool. You can also simply copy this file from this directory:

cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MetaData_jobOptions.py $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/share/.

You can copy the D2PD job option script from the tutorial repository:

# go the TopPhysDPDMaker test area
cd  $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/test

# copy this script
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example_TopPhysDPDMaker_D2PD.py .

# run it
athena Example_TopPhysDPDMaker_D2PD.py &> D2PD.log

You can of course add things that you think are missing from your D2PD. This is straightforward: in the above script, Example_TopPhysDPDMaker_D2PD.py, add the list of items that you need to the StreamDPD at the end.
For instance, as you can see from the D1PDItemList_jobOptions.py file, the calibrated topological clusters using Local Hadron Calibration are missing, and thus you cannot rerun a jet algorithm such as the kT algorithm.
In order to solve this issue, simply add the container to the item list at the end of your script.

InputCollections = ['rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/trig1_misal1_mc12.005200.T1_McAtNlo_Jimmy.recon.AOD.v13003004_tid018416._00002.pool.root.1']
EvtMax=10

include("TopPhysDPDMaker/ElectroweakD2PD_topOptions.py")

# add the calibrated topo-clusters (to be able to re-run jet algorithms ;-)
StreamDPD.ItemList += ['CaloClusterContainer#CaloCalTopoCluster']

Rerun your Example_TopPhysDPDMaker_D2PD.py:

athena Example_TopPhysDPDMaker_D2PD.py &> D2PD.log
You can then use the checkFile.py utility to check that the CaloCalTopoCluster collection is now in your D2PD:
checkFile.py  Electroweak.D2PD.pool.root  &>D2PD.contents

Look at your D2PD.contents file and you will understand why this collection was dropped in order to keep the D1PD at 10% of the AOD size:

## opening file [Electroweak.D2PD.pool.root]...
## importing ROOT...
## importing ROOT... [DONE]
## opening file [OK]
File:Electroweak.D2PD.pool.root
Size:     1139.470 kb
Nbr Events: 10

================================================================================
     Mem Size       Disk Size        Size/Evt      items  (X) Container Name (X=Tree|Branch)
================================================================================
     145.210 kb        0.000 kb        0.000 kb       10  (T) DataHeader
--------------------------------------------------------------------------------
     106.726 kb        0.000 kb        0.000 kb       10  (B) ElectronContainer_p1_ElectronAODCollection
      63.666 kb        0.000 kb        0.000 kb       10  (B) PhotonContainer_p1_PhotonAODCollection
      53.986 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_AtlfastTauJet1p3pContainer
      50.448 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_AtlfastTauJetContainer
      73.368 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_Tau1P3PContainer
      79.288 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_TauRecContainer
       8.779 kb        0.000 kb        0.000 kb       10  (B) MissingET_p1_MET_RefFinal
     312.497 kb        0.000 kb        0.000 kb       10  (B) TauDetailsContainer_tlp1_Tau1P3PDetailsContainer
     311.751 kb        0.000 kb        0.000 kb       10  (B) TauDetailsContainer_tlp1_TauRecDetailsContainer
     105.940 kb        0.000 kb        0.000 kb       10  (B) MuonContainer_p1_CaloMuonCollection
     103.398 kb        0.000 kb        0.000 kb       10  (B) MuonContainer_p1_MuidMuonCollection
     105.516 kb        0.000 kb        0.000 kb       10  (B) MuonContainer_p1_StacoMuonCollection
      62.622 kb        0.000 kb        0.000 kb       10  (B) egDetailContainer_p1_egDetailAOD
       2.058 kb        0.000 kb        0.000 kb       10  (B) MuonSpShowerContainer_p1_MuonSpShowers
     231.486 kb        0.000 kb        0.000 kb       10  (B) Rec::TrackParticleContainer_tlp1_MuonboyMuonSpectroOnlyTrackParticles
     229.926 kb        0.000 kb        0.000 kb       10  (B) Rec::TrackParticleContainer_tlp1_StacoTrackParticles
      25.933 kb        0.000 kb        0.000 kb       10  (T) Trk::MVFVxContainer_tlp1
     361.583 kb        0.000 kb        0.000 kb       40  (T) POOLContainer_TauDetailsContainer_tlp1
      88.916 kb        0.000 kb        0.000 kb       40  (T) POOLContainer_TauJetContainer_p1
      78.177 kb        1.058 kb        0.106 kb       10  (B) EventInfo_p2_McEventInfo
     118.611 kb        6.828 kb        0.683 kb       10  (B) ParticleJetContainer_p1_Cone4H1TopoParticleJets
     113.652 kb       15.072 kb        1.507 kb       10  (B) ParticleJetContainer_p1_Cone4H1TowerParticleJets
     161.068 kb       25.350 kb        2.535 kb       10  (B) ParticleJetContainer_p1_Kt6H1TowerParticleJets
     487.097 kb       28.580 kb        2.858 kb       10  (B) Rec::TrackParticleContainer_tlp1_TrackParticleCandidate
     334.299 kb       56.401 kb        5.640 kb       10  (B) Trk::VxContainer_tlp1_VxPrimaryCandidate
    1026.691 kb      205.808 kb       20.581 kb       10  (B) CaloClusterContainer_p2_CaloCalTopoCluster
================================================================================
    4842.692 kb      339.097 kb       33.910 kb       10  TOTAL (POOL containers)
================================================================================
## Bye.
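To quickly see which collection dominates the on-disk size in such a dump, you can rank the (B)/(T) rows by their disk-size column. Here is a small sketch in plain Python (the sample rows are copied from the listing above; the helper name is ours, not part of checkFile.py):

```python
# Rank containers by on-disk size from a checkFile.py dump.
# The sample lines below are taken from the D2PD.contents listing above.
sample = """\
      78.177 kb        1.058 kb        0.106 kb       10  (B) EventInfo_p2_McEventInfo
     118.611 kb        6.828 kb        0.683 kb       10  (B) ParticleJetContainer_p1_Cone4H1TopoParticleJets
    1026.691 kb      205.808 kb       20.581 kb       10  (B) CaloClusterContainer_p2_CaloCalTopoCluster
"""

def rank_by_disk_size(dump):
    """Return (container name, disk size in kb) pairs, largest first."""
    rows = []
    for line in dump.splitlines():
        parts = line.split()
        # expect: mem, 'kb', disk, 'kb', per-event, 'kb', items, (X), name
        if len(parts) == 9 and parts[1] == parts[3] == "kb":
            rows.append((parts[8], float(parts[2])))
    return sorted(rows, key=lambda r: r[1], reverse=True)

print(rank_by_disk_size(sample)[0])  # the CaloCalTopoCluster row dominates
```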

Producing a D3PD with TopPhysDPDMaker

TopPhysDPDMaker can also dump flat ntuples or, in the new terminology, D3PDs. This part is not covered in this tutorial. Note that producing the D3PD directly from the AOD has several advantages, e.g. you do not need to worry about the trigger information.

Running your own algorithm and dumping a D2PD (with e.g. the di-jet mass) with ATHENA and with ROOT

In this section, we show how to create an algorithm, run it as an ATHENA algorithm and then, with AthenaROOTAccess, repeat the same thing in ROOT.
To illustrate this, we write a small algorithm which, given a jet container, selects jets passing given pT and eta cuts and finds the two closest jets in delta-R space.
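Before looking at the C++ implementation, the pairing logic can be sketched in plain Python (jets as (pt, eta, phi) tuples; the helper names are ours, not those of the WBosonBuilder class):

```python
import math

def delta_r(j1, j2):
    """Delta-R between two jets given as (pt, eta, phi) tuples."""
    deta = j1[1] - j2[1]
    # wrap the phi difference into [-pi, pi)
    dphi = (j1[2] - j2[2] + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(deta, dphi)

def closest_pair(jets, pt_min=30.0, eta_max=2.5):
    """Apply the pT/eta preselection, then return the indices (within the
    selected list) of the two closest jets in (eta, phi) space."""
    sel = [j for j in jets if j[0] > pt_min and abs(j[1]) < eta_max]
    pairs = [(delta_r(a, b), (i, k))
             for i, a in enumerate(sel)
             for k, b in enumerate(sel) if i < k]
    return min(pairs)[1] if pairs else None

jets = [(50.0, 0.1, 0.0), (40.0, 0.3, 0.2), (35.0, -2.0, 3.0), (20.0, 1.0, 1.0)]
print(closest_pair(jets))  # prints (0, 1): the last jet fails the pT cut
```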

Create your algorithm

The creation of an ATHENA algorithm is described in more detail in this Twiki. Simply follow the command lines below:


#You need to do all this in your ATLAS TEST AREA!!!!!
cd $HOME/athena/13.0.40

# create a new package called MyNewPackage
cmt create MyNewPackage MyNewPackage-00-00-01

# the share directory will contain our job-option files
mkdir $HOME/athena/13.0.40/MyNewPackage/*/share/

mkdir $HOME/athena/13.0.40/MyNewPackage/*/run/

mkdir -p $HOME/athena/13.0.40/MyNewPackage/*/src/components

cd $HOME/athena/13.0.40/MyNewPackage/*/cmt/

# copy the requirement file containing list of required packages
cp -f /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/requirements .

cd $HOME/athena/13.0.40/MyNewPackage/*/src/

# copy the algorithm MyAlg.cxx
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyAlg.cxx .

# copy this class called WBosonBuilder
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/WBosonBuilder.cxx .

cd $HOME/athena/13.0.40/MyNewPackage/*/src/components
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackage_entries.cxx .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackage_load.cxx .

# copy all header files
cd $HOME/athena/13.0.40/MyNewPackage/*/MyNewPackage
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyAlg.h .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/WBosonBuilder.h .

# so that our software can also run from ROOT, we need this file listing all header files used for LCG dictionary generation
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackageDict.h .

# copy the XML class-selection file; lcgdict uses it for dictionary generation
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/selextion.xml selection.xml


# ask CMT for configuration and script generation
cd $HOME/athena/13.0.40/MyNewPackage/*/cmt/
cmt config
source setup.sh
make

While the algorithm is compiling, let's look at our requirements file and see what it contains that makes the WBosonBuilder object callable from ROOT:

      1 package MyNewPackage
      2 
      3 use AtlasPolicy                  AtlasPolicy-01-*
      4 use GaudiInterface           GaudiInterface-01-*         External
      5 use JetTagEvent                 JetTagEvent-*                   PhysicsAnalysis/JetTagging       ---> to access the ParticleJet and ParticleJetContainer classes
      6 use FourMom                    FourMom-*                       Event                                           ---> to access P4Help class
      7 use FourMomUtils             FourMomUtils-*                Event
      8 
      9 library MyNewPackage *.cxx -s=components *.cxx
     10 apply_pattern component_library
     11 
     12 apply_pattern declare_joboptions files="MyJobOptions.py"                                       ---> Name of our python job-option file
     13 
     14 # In order to be able to load all this in a ROOT session
     15 private
     16 use AtlasReflex   AtlasReflex-00-*   External -no_auto_imports                                   --->  Load genreflex for LCG dictionary  (lcgdict) file for each header file
     17 apply_pattern lcgdict dict=MyNewPackage selectionfile=selection.xml headerfiles="../MyNewPackage/MyNewPackageDict.h"   ---> the list of files for which we want the LCG dictionaries
     18 end_private

genreflex (lcgdict) is the tool for LCG dictionary generation; it is interfaced to ATLAS through the AtlasReflex package. In MyNewPackageDict.h, we put the header files for the objects we want to include in MyNewPackage.so and MyNewPackageDict.so, which will be loaded in ROOT (to know more, see LCGDictionary).
The selection.xml file specifies the classes for which the dictionaries must be generated.
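For illustration, a minimal selection file typically looks like the following (hypothetical content; the file you copied lists the actual classes of the package):

```xml
<lcgdict>
  <!-- request dictionaries for the classes we want visible from ROOT -->
  <class name="WBosonBuilder"/>
</lcgdict>
```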

If compilation is still not finished, you can in the meantime open the source files for the MyAlg and WBosonBuilder classes, which select jets passing given pT and eta cuts, and see how they are written.

If compilation ran smoothly, we can see how to run this algorithm with ATHENA and then how to use the WBosonBuilder from ROOT!

Run your algorithm with ATHENA

We want to run MyAlg, which is an interface to the WBosonBuilder, over one Atlfast AOD. This is done through MyJobOptions.py, in which we instantiate MyAlg. Let's copy it and look at it:

cd $HOME/athena/13.0.40/MyNewPackage/*/share/
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyJobOptions.py .
The MyJobOptions.py file looks like:
      1 
      2 #------------------------------------------------------------------------
      3 # import the data types
      4 import EventKernel.ParticleDataType
      5 
      6 #------------------------------------------------------------------------
      7 # get a handle on the ServiceManager which holds all the services
      8 from AthenaCommon.AppMgr import ServiceMgr
      9 
     10 # the converters
     11 import AthenaPoolCnvSvc.ReadAthenaPool
     12 include ( "ParticleBuilderOptions/ESD_PoolCnv_jobOptions.py" )
     13 include ( "ParticleBuilderOptions/AOD_PoolCnv_jobOptions.py" )
     14 include ( "ParticleBuilderOptions/McAOD_PoolCnv_jobOptions.py" )
     15 include ( "EventAthenaPool/EventAthenaPool_joboptions.py" )
     16 
     17 #------------------------------------------------------------------------
     18 # give the library
     19 theApp.Dlls   += ["MyNewPackage"]
     20 theApp.Dlls   += ["MyNewPackageDict"]
     21 
     22 #------------------------------------------------------------------------
     23 # Full job is a list of algorithms
     24 from AthenaCommon.AlgSequence import AlgSequence
     25 TopSeq = AlgSequence()
     26 
     27 # Add top algorithms to be run
     28 from MyNewPackage.MyNewPackageConf import MyAlg
     29 WBoson = MyAlg("WBoson")
     30 WBoson.OutputLevel = INFO
     31 WBoson.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
     32 WBoson.OutputHadronicWBosonContainerName='myWBoson'
     33 WBoson.CutJetPt  = 30 * GeV
     34 WBoson.CutJetEta = 2.5
     35 #------------------------------------------------------------------------
     36 # add this algorithm to the algorithm sequence
     37 TopSeq += [WBoson]
     38 
     39 #------------------------------------------------------------------------
     40 # Set output level threshold (2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL )
     41 from AthenaCommon.AppMgr import ServiceMgr
     42 ServiceMgr.MessageSvc = Service( "MessageSvc" )
     43 ServiceMgr.MessageSvc.OutputLevel = DEBUG
     44 ServiceMgr.MessageSvc.Format = "% F%75W%S%7W%R%T %0W%M"
     45 ServiceMgr.MessageSvc.defaultLimit=10000000
     46 
     47 #------------------------------------------------------------------------
     48 ServiceMgr.EventSelector.InputCollections = ['AOD.pool.root'];
     49 
     50 #------------------------------------------------------------------------
     51 # Number of Events to process
     52 theApp.EvtMax = -1

At line 29, we instantiate the algorithm, WBoson = MyAlg("WBoson"), and then set its properties: the message level, the input jet container name, and the container name under which the reconstructed hadronic W bosons will be stored in StoreGate. We also set the pT and eta cut values for the jet skimming.

     29 WBoson = MyAlg("WBoson")
     30 WBoson.OutputLevel = INFO
     31 WBoson.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
     32 WBoson.OutputHadronicWBosonContainerName='myWBoson'
     33 WBoson.CutJetPt  = 30 * GeV
     34 WBoson.CutJetEta = 2.5

Let's run it to check that it works fine.

cd $HOME/athena/13.0.40/MyNewPackage/*/share/
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyJobOptions.py .
cd $HOME/athena/13.0.40/MyNewPackage/*/run/

# run it
athena  ../share/MyJobOptions.py &> athena.log
You can have a look at the athena.log file to check that your algorithm runs and reconstructs a jet from the two closest jets in (eta, phi) space.

Now, we would like to store this object in a DPD and then view the mass distribution with ROOT. This is explained in the next section.
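As a reminder of what that mass is: the reconstructed di-jet mass is the standard invariant mass of the summed four-vectors. A plain-Python sketch (hypothetical helper names, jets treated as massless):

```python
import math

def four_vector(pt, eta, phi, m=0.0):
    """(E, px, py, pz) from collider kinematics; jets treated as massless by default."""
    px, py = pt * math.cos(phi), pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(m * m + px * px + py * py + pz * pz)
    return (e, px, py, pz)

def inv_mass(v1, v2):
    """m^2 = (E1 + E2)^2 - |p1 + p2|^2"""
    e = v1[0] + v2[0]
    p2 = sum((a + b) ** 2 for a, b in zip(v1[1:], v2[1:]))
    return math.sqrt(max(e * e - p2, 0.0))

# two back-to-back 40 GeV massless jets at eta = 0 give a di-jet mass of 80 GeV
j1 = four_vector(40.0, 0.0, 0.0)
j2 = four_vector(40.0, 0.0, math.pi)
print(round(inv_mass(j1, j2), 3))  # prints 80.0
```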

Run your algorithm with ATHENA and produce a DPD which contains the di-jet invariant mass

Test: rerun the algorithm above, but this time store the reconstructed hadronic W boson candidates in a DPD!

Solution: This is rather easy: we simply add the Athena POOL output stream to the previous job-option file, as we did before, and add our container to the ItemList:

from AthenaPoolCnvSvc.WriteAthenaPool import AthenaPoolOutputStream
StreamDPD = AthenaPoolOutputStream( "StreamDPD" )

# everything which is an EventInfo object
StreamDPD.ItemList  =  ['EventInfo#*']

#- my di-Jet mass
StreamDPD.ItemList += ['ParticleJetContainer#myWBoson']

StreamDPD.ForceRead= TRUE
StreamDPD.OutputFile= 'DPD.pool.root'

Again, you can check with checkFile.py that the produced DPD contains the myWBoson container.
Test: Store two di-jet masses in the DPD: the first, called diJetMass40, for jets with pT > 40 GeV, and the second, labeled diJetMass30, for jets with pT > 30 GeV.
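One possible solution sketch (a job-option fragment, not tested here; it assumes MyAlg's properties behave as shown above, with the output container name set through OutputHadronicWBosonContainerName):

```python
# two instances of MyAlg with different jet-pT thresholds (sketch)
diJet30 = MyAlg("diJet30")
diJet30.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
diJet30.OutputHadronicWBosonContainerName = 'diJetMass30'
diJet30.CutJetPt  = 30 * GeV
diJet30.CutJetEta = 2.5

diJet40 = MyAlg("diJet40")
diJet40.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
diJet40.OutputHadronicWBosonContainerName = 'diJetMass40'
diJet40.CutJetPt  = 40 * GeV
diJet40.CutJetEta = 2.5

TopSeq += [diJet30, diJet40]

# write both containers to the DPD
StreamDPD.ItemList += ['ParticleJetContainer#diJetMass30',
                       'ParticleJetContainer#diJetMass40']
```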

Viewing the di-jet invariant mass with ROOT

In this section, we show how to view the di-jet mass distribution using AthenaROOTAccess.
Let's first install AthenaROOTAccess. For release 13.0.40, we will use the recommended tag AthenaROOTAccess-00-00-38-09:

cd $HOME/athena/13.0.40
cmt co -r AthenaROOTAccess-00-00-38-09  PhysicsAnalysis/AthenaROOTAccess
cd $HOME/athena/13.0.40/PhysicsAnalysis/AthenaROOTAccess/*/cmt
source setup.sh
make

Once compilation is finished, we will, within a ROOT session, create a histogram and read the transient TTree created by executing test.py, a Python script provided with AthenaROOTAccess-00-00-38-09. Let's get it:

cd $HOME/athena/13.0.40/MyNewPackage/*/run/
get_files test.py

You then need to fix the name of the DPD file in this Python script. Let's do it:

mv test.py loadDPD.py
sed -i 's/AOD.pool.root/DPD.pool.root/' loadDPD.py
Let's then load the script in ROOT

// start ROOT
root -l

// load the DPD using the TPython module
// if you want to do this with something written in C++, you would need to rewrite the AthenaROOTAccess/python/transientTree.py in C++
TPython::ExecScript("loadDPD.py")

// list the current directory contents: you should see the CollectionTree_trans
gDirectory->ls();

// get the transient TTree, whose name is CollectionTree_trans (can be changed through the python script above)
TTree *t = (TTree*)gDirectory->Get("CollectionTree_trans");

// Now let's print the TTree: you should see the myWBoson container!
t->Print();

// let's get the TBranch
TBranch *bPJC =  (TBranch*)t->GetBranch("myWBoson");

// create a ParticleJetContainer variable
ParticleJetContainer *PJC = new ParticleJetContainer;

// attach the ParticleJetContainer to it
bPJC->SetAddress(&PJC);

// let's get one entry
bPJC->GetEntry(0);

// get the reconstructed particle built from the two closest jets in eta, phi
ParticleJet *PJ  = (*PJC)[0];

// Print  the mass for the reconstructed invariant di-jet mass
std::cout << PJ->m() << std::endl;

Exercise: create a histogram, fill it with the reconstructed mass, and draw it!
Solution:

// start ROOT
root -l

// load the DPD using the TPython module
// if you want to do this with something written in C++, you would need to rewrite the AthenaROOTAccess/python/transientTree.py in C++
TPython::ExecScript("loadDPD.py")

// list the current directory contents: you should see the CollectionTree_trans
gDirectory->ls();

// get the transient TTree, whose name is CollectionTree_trans (can be changed through the python script above)
TTree *t = (TTree*)gDirectory->Get("CollectionTree_trans");

// Now let's print the TTree: you should see the myWBoson  container!
t->Print();

// let's get the TBranch
TBranch *bPJC =  (TBranch*)t->GetBranch("myWBoson");

// create a ParticleJetContainer variable
ParticleJetContainer *PJC = new ParticleJetContainer;

// attach the ParticleJetContainer to it
bPJC->SetAddress(&PJC);

// let's print the total number of entries
std::cout << t->GetEntriesFast() << std::endl;

// create the TH1F histogram
TH1F *h1_jj_mass =  new TH1F("h1_jj_mass","",100,0,500);

for (int event=0; event <  t->GetEntriesFast(); event++){ bPJC->GetEntry(event); if (PJC->size()>0) h1_jj_mass->Fill( ((*PJC)[0])->m()/1000);}

// Draw the distribution
h1_jj_mass->Draw();

Run your algorithm with ROOT

In this section, we show how the WBosonBuilder class can be used within a ROOT session to analyse an AOD or a D1PD. As we have shown before, adding these lines to the requirements file:
     15 private
     16 use AtlasReflex   AtlasReflex-00-*   External -no_auto_imports                                   --->  Load genreflex for LCG dictionary generation
     17 apply_pattern lcgdict dict=MyNewPackage selectionfile=selection.xml headerfiles="../MyNewPackage/MyNewPackageDict.h"   ---> the list of files for which we want the LCG dictionaries
     18 end_private
produces (via genreflex) the LCG dictionaries for all classes listed in selection.xml, and thus these objects become known to ROOT. Then, in order to access the WBosonBuilder object in ROOT, one simply needs to load the MyNewPackage.so and MyNewPackageDict.so shared objects.
Now, let's redo the exercise with ROOT. But first, let's modify the test.py file to load the AOD:
mv test.py loadAOD.py

Edit loadAOD.py and change AOD.pool.root to rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/mc13.005200.10TeV.atlfast.AOD._00200.v13004002.pool.root

Now let's run on the AOD with the WBosonBuilder:


root -l
// Load first the AOD and extract the transient CollectionTree
TPython::ExecScript("loadAOD.py");

//  Get the transient TTree
TTree *t = (TTree*)gDirectory->Get("CollectionTree_trans");

// Load the shared object and its dictionary to get access to WBosonBuilder
// don't worry about paths: LD_LIBRARY_PATH includes $HOME/athena/13.0.40/InstallArea/lib (check it!)

gSystem->Load("libMyNewPackage.so");
gSystem->Load("libMyNewPackageDict.so");

// Create a WBosonBuilder object!
WBosonBuilder *wBuilder = new WBosonBuilder();

// let's get the TBranch
TBranch *bPJC =  (TBranch*)t->GetBranch("AtlfastParticleJetContainer");

// create a ParticleJetContainer variable
ParticleJetContainer *PJC = new ParticleJetContainer;

// attach the ParticleJetContainer to it
bPJC->SetAddress(&PJC);

// get the first event
bPJC->GetEntry(0);

// how many jets do we have in this first event
std::cout << PJC->size() << std::endl;

// use the WBosonBuilder to  select all jets which have a pT>30GeV and |eta|<3.0
ParticleJetContainer* sPJC = wBuilder->doPreselection(PJC, 30000, 3.0);

// print the number of selected jets!
std::cout << sPJC->size() << std::endl;

// let's now use the WBosonBuilder::build method to combine the two closest jets!
ParticleJetContainer* WC = wBuilder->build(sPJC);

// print the reconstructed invariant mass!
ParticleJet *W = (*WC)[0];
std::cout << W->m() << std::endl;

Now you know how, with ROOT, to re-use the same software!

Exercise: starting from what we did above, write a small ROOT macro which loops over all entries and plots the reconstructed invariant di-jet mass!

Solution: you can find the solution files at this location:


cd $HOME/athena/13.0.40/MyNewPackage/*/run/

# copy the macro
cp  /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/diJetMass.C .

# copy the python script for loading the AOD
cp  /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/loadAOD.py .

Then, you can run this macro:

root -l

gROOT->LoadMacro("diJetMass.C")

// call the diJetMass macro
diJetMass()

Exercise: analysing FDR AODs with AthenaROOTAccess

We have shown how to run on an AOD or a DPD with AthenaROOTAccess. In this section, we show how to run on the FDR08 Muon stream and produce the di-muon invariant mass spectrum.

Exercise: from the fdr08_run1.0003070.StreamMuon.merge.AOD.o1_r12_t1._0001.1 file:

  • write a muon filter which requires the event to have at least two StacoMuonCollection muons with pT > 10 GeV and |eta| < 3;

  • write a small job-option file which creates a DPD using the previous filter!

Solution: you can find the two scripts in this directory:

# the muon filter
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/muon_filter.py .

cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/fdr_AODtoDPD.py .

Now that we have our DPD, let's analyse it with ROOT and find events with exactly two muons of opposite charge.

Exercise: write a ROOT macro which, given the transient TTree produced with the AthenaROOTAccess test.py script,

* loops over all entries

* keeps events with only two muons of opposite charge

* plots the invariant mass of these two muons
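Before opening the provided macro, the event-selection step can be sketched in plain Python (muons as hypothetical (pt, eta, phi, charge) tuples; in the real macro these would come from a muon branch such as the StacoMuonCollection used above):

```python
def select_os_dimuon(muons):
    """Keep the event only if it contains exactly two muons with opposite charge.
    Muons are (pt, eta, phi, charge) tuples; returns the pair or None."""
    if len(muons) == 2 and muons[0][3] * muons[1][3] < 0:
        return muons[0], muons[1]
    return None

# an opposite-charge pair passes, a same-charge pair does not
print(select_os_dimuon([(20.0, 0.1, 0.0, +1), (25.0, -0.4, 2.9, -1)]) is not None)  # prints True
print(select_os_dimuon([(20.0, 0.1, 0.0, +1), (25.0, -0.4, 2.9, +1)]) is None)      # prints True
```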

Solution: copy the macro from this directory:

cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Z0toMuMuAnalysis.C .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/loadFDR.py .

Analyse the DPD (or the AOD!):

root -l

// load the Macro
gROOT->LoadMacro("Z0toMuMuAnalysis.C");

// get the TTree
TTree *t = LoadTree();

// produce the di-muon invariant mass
Z0toMuMuAnalysis(t);

Tip: at this level, you can write a class like WBosonBuilder, call it from ROOT, and compare the timing!

Conclusions

With all these tips, you should now be able to run on the coming data!

A few words on what was not covered in this tutorial:

  • GRID tools: the explanations given on the ATLAS twiki are self-explanatory enough for you to run on your own.

  • More sophisticated tools for top physics: have a look at what is being done in the ARATopQuarkAnalysis package. We are currently working on an interface to the analysis package developed at LPSC (Grenoble).

FAQ

tcsh or bash?

In the past, several people experienced a "word too long" error when using tcsh, because older versions of tcsh have a limit on word length. I don't know the status of more recent versions, but the easiest solution I found at the time was simply to change my shell to bash. So change it to bash and your life will be much easier.

I cannot produce the McAtNlo generator pool event file!

The mc13.005200.dat file requires special formatting: a typical failure is that the McAtNlo_i interface fails to decode it, most often because of a missing or an extra blank character. Thus, stay in overwrite mode if you use emacs to edit mc13.005200.dat.

Can you provide me with an already produced Atlfast AOD at 10 TeV?

Even though, by following this tutorial, you will be able to produce an ATLFAST AOD yourself, I have put one on my CASTOR area at CERN:

/castor/cern.ch/user/g/ghodbane/CCIN2P3-27052008/mc13.005200.10TeV.atlfast.AOD._00200.v13004002.pool.root

Note that if you need more statistics, you can also have a look at this CASTOR location:

rfdir /castor/cern.ch/grid/atlas/tzero/atlasusertape/user/ghodbane/user.NabilGhodbane.mc13.005200.10TeV.atlfast.AOD.v13004002.1
http://atlas-france.in2p3.fr/cgi-bin/twiki/bin/view/Atlas/ARATutorialCCIN2P3AthenaRelease14


 -- ghodban@IN2P3.FR - 26 May 2008
META FILEATTACHMENT attachment="CCIN2P3-27052008.tar.gz" attr="" comment="Material: contents of the CCIN2P3-27052008 directory" date="1212135033" name="CCIN2P3-27052008.tar.gz" path="CCIN2P3-27052008.tar.gz" size="598655" stream="CCIN2P3-27052008.tar.gz" user="ghodban@IN2P3.FR" version="1"

Revision 918 Jun 2008 - Main.FR

Line: 1 to 1
 

Goals of this tutorial

Line: 1226 to 1226
 

I cannot produce the Generator pool Mcatnlo event file!

The decoding of the mc13.005200.dat requires a special formatting! a typical failure is due to the fact that the Mcatnlo_i interface, fails to decode this file. a missing or an additional blank character is most of the time the reason for which it will fail. thus, stay in overwrite mode if using emacs when changing things in mc13.005200.dat.
Added:
>
>

Can you provide me with an already produced Atlfast AOD at 10 TeV?

Even if, by following this tutorial, you will be able to produce an ATLFAST AOD, I put one AOD on my CASTOR area at CERN:

/castor/cern.ch/user/g/ghodbane/CCIN2P3-27052008/mc13.005200.10TeV.atlfast.AOD._00200.v13004002.pool.root

Note, that if you need more statistics, you can have a look at this CASTOR location as well:

rfdir /castor/cern.ch/grid/atlas/tzero/atlasusertape/user/ghodbane/user.NabilGhodbane.mc13.005200.10TeV.atlfast.AOD.v13004002.1
 


Revision 805 Jun 2008 - Main.FR

Line: 1 to 1
 

Goals of this tutorial

Line: 21 to 21
 

Setting up ATHENA 13.0.40

Added:
>
>
The settings below are of course meant for people who have an account on the CCIN2P3 machines! For an ATHENA kit or lxplus, simply follow the usual explanations on how to setup your ATHENA environment on this twiki at CERN: WorkBookSetAccount.
 
# connect to a SLC3 32-bit node
ssh -X ccalisl3.in2p3.fr

Revision 704 Jun 2008 - Main.FR

Line: 1 to 1
 

Goals of this tutorial

Line: 25 to 25
 # connect to a SLC3 32-bit node ssh -X ccalisl3.in2p3.fr
Added:
>
>
# you can connect to the SLC4 32-bit node as well, but then below, you need to make sure that you use the slc4 tag!!!!! ssh -X ccalisl432.in2p3.fr
 # create your ATHENA working area mkdir -p $HOME/athena/13.0.40/cmt cd $HOME/athena/13.0.40/cmt
Line: 41 to 44
 # each time you login and want to setup your ATHENA 13.0.40 with AtlasProduction?-13.0.40.3 release, do: source ~/athena/13.0.40/cmt/setup.sh -tag=13.0.40.3,32,AtlasProduction,slc3
Added:
>
>
#if you connected to an SLC4 32 bit node, use slc4 as tag!!!! source ~/athena/13.0.40/cmt/setup.sh -tag=13.0.40.3,32,AtlasProduction,slc4
 # you can check that things are consistent, by printing the CMTPATH variable. For instance, user ghodban should see: echo $CMTPATH /afs/in2p3.fr/home/g/ghodban/athena/13.0.40:/afs/in2p3.fr/group/atlas/sw/prod/releases/rel_13-4/AtlasProduction/13.0.40.3

Revision 630 May 2008 - Main.FR

Line: 1 to 1
 

Goals of this tutorial

Line: 1215 to 1215
 

tcsh or bash?

in the past several people experienced a "word too long" error when using tcsh and this because the version of the used tcsh has a limited length in the word length. I don't know what the status for more recent versions is, but the easiest solution that i found at that time, was to simply change my shell to bash. So change it to bash and your life will be much easier
Added:
>
>

I cannot produce the Generator pool Mcatnlo event file!

The decoding of the mc13.005200.dat requires a special formatting! a typical failure is due to the fact that the Mcatnlo_i interface, fails to decode this file. a missing or an additional blank character is most of the time the reason for which it will fail. thus, stay in overwrite mode if using emacs when changing things in mc13.005200.dat.
 

Revision 530 May 2008 - Main.FR

Line: 1 to 1
 

Goals of this tutorial

Line: 1285 to 1285
 

-- ghodban@IN2P3.FR - 26 May 2008

Added:
>
>

META FILEATTACHMENT attachment="CCIN2P3-27052008.tar.gz" attr="" comment="Material: contents of the CCIN2P3-27052008 directory" date="1212135033" name="CCIN2P3-27052008.tar.gz" path="CCIN2P3-27052008.tar.gz" size="598655" stream="CCIN2P3-27052008.tar.gz" user="ghodban@IN2P3.FR" version="1"

Revision 428 May 2008 - Main.FR

Line: 1 to 1
Added:
>
>
 

Goals of this tutorial

The ATLAS analysis data format is evolving...This tutorial aims to introduce in a simplified and clear manner the recently introduced new data format: the Derived Physics Dataset (DPD).

Line: 650 to 652
 Then you will be able to run the D2PD production tool. You can also simply copy this file from this directory:
Changed:
<
<
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MetaData_jobOption.py $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/share/.
>
>
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MetaData_jobOptions.py $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/share/.
 

You can copy this script from the tutorial repository:

Revision 327 May 2008 - Main.FR

Line: 1 to 1
 

Goals of this tutorial

The ATLAS analysis data format is evolving...This tutorial aims to introduce in a simplified and clear manner the recently introduced new data format: the Derived Physics Dataset (DPD).

Line: 596 to 596
 cmt co -r HighPtView?-00-01-11 PhysicsAnalysis?/HighPtPhys/HighPtView cmt co -r HEAD PhysicsAnalysis?/TopPhys/TopPhysUtils/HitFit
Changed:
<
<
and follow the explanations given on this Twiki
>
>
and follow the explanations given on this Twiki for setting up the group area
 

Producing a D2PD with TopPhysDPDMaker?

Line: 613 to 613
 
Added:
>
>
If you run this script, it will crash with the following error message:

IOVDbMgr            ERROR Unable to get default connection to COOL Conditions database.
IOVDbMgr            ERROR    Please set job option: IOVDbSvc.dbConnection  = <db connection string>
IOVDbSvc            ERROR Unable to get dbConnection - empty connection string 
ServiceManager      ERROR Unable to initialize Service: DetectorStore

This is somehow consistent with what was reported on the hypernews forums. As explained on the TopphysDPDMaker? twiki, release 13 is a transition release since:

  • 13.0.30 -> no dB, but trigger config present event-by-event in the AOD

  • fdr -> dB exists, plus we have DataStore?? metaheader

  • 13.0.40 is a sort of transition, there's no dB, no AOD info, no DataStore??

The suggested solution is to simply drop the trigger part as this is suggested in the D3PD case. To do this for the D2PD case, you need to edit TopPhysDPDMaker/*/share/MetaData_jobOptions.py and comment out these lines:

# Metadata Info for Trigger Configuration
# import IOVDbSvc.IOVDb
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/HLT/Menu" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/HLT/HltConfigKeys" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/LVL1/Lvl1ConfigKey" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/LVL1/Menu" ]
# svcMgr.IOVDbSvc.Folders += [ "/TRIGGER/LVL1/Prescales" ]

# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/HLT/Menu" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/HLT/HltConfigKeys" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/LVL1/Lvl1ConfigKey" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/LVL1/Menu" ]
# svcMgr.IOVDbSvc.FoldersToMetaData += [ "/TRIGGER/LVL1/Prescales" ]

Then you will be able to run the D2PD production tool. You can also simply copy this file from this directory:

cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MetaData_jobOption.py $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/share/.
 You can copy this script from the tutorial repository:
# go the TopPhysDPDMaker test area
Line: 1170 to 1210
 

FAQ

Added:
>
>

tcsh or bash?

in the past several people experienced a "word too long" error when using tcsh and this because the version of the used tcsh has a limited length in the word length. I don't know what the status for more recent versions is, but the easiest solution that i found at that time, was to simply change my shell to bash. So change it to bash and your life will be much easier
 

Revision 2 - 27 May 2008 - Main.FR

 

The creation of an ATHENA algorithm is described in more detail in this Twiki. Simply follow the few command lines below:
# You need to do all of this in your ATLAS_TEST_AREA!
 cd $HOME/athena/13.0.40

# create a new package called MyNewPackage

# in order to be able to have our software run with ROOT as well, we need to have this file containing all header files which will be used for the LCG dictionary generation!
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackageDict.h .
# copy the XML file containing all the header files; this XML file is then used by lcgdict for dictionary generation
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/selextion.xml .
# ask CMT for configuration and script generation
cd $HOME/athena/13.0.40/MyNewPackage/*/cmt/
cmt config

Revision 1 - 26 May 2008 - Main.FR


Goals of this tutorial

The ATLAS analysis data format is evolving. This tutorial aims to introduce, in a simplified and clear manner, the recently introduced data format: the Derived Physics Dataset (DPD).

Questions such as: what is a DPD, what does it contain, how can I create my own DPD, how can I run the centrally maintained tools for DPD production, and finally how can I analyse a DPD with both ATHENA and ROOT, are addressed.

This tutorial is meant for ATHENA release 13.0.40 only (except maybe for the event generation and simulation parts). It will thus be updated accordingly for release 14.0.X.

Moreover, a FAQ section covering recurring issues will be maintained.

Setting up your ATHENA session at CCIN2P3 for release 13.0.40

The following settings are described in more details at this URL: GuideAthena twiki.

In all that follows, we assume that your default shell is bash (check your $SHELL environment variable). If you're using tcsh, adapt the shell commands accordingly.

Finally, in this tutorial we use $HOME/athena/13.0.40 as the ATLAS_TEST_AREA. If you have disk space issues and you have a $GROUP_DIR/$LOGNAME area, replace the $HOME variable with $GROUP_DIR/$LOGNAME in everything that follows.

Setting up ATHENA 13.0.40

# connect to a SLC3 32-bit node
ssh -X ccalisl3.in2p3.fr

# create your ATHENA working area
mkdir -p $HOME/athena/13.0.40/cmt
cd $HOME/athena/13.0.40/cmt

# copy this requirement file
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/requirements .

# source this script only once!
source /afs/in2p3.fr/group/atlas/sw/prod/releases/rel_13-4/CMT/v1r20p20070720/mgr/setup.sh

# configure and produce the setup scripts (only once!)
cmt config

# each time you login and want to setup your ATHENA 13.0.40 with AtlasProduction-13.0.40.3  release, do:
source ~/athena/13.0.40/cmt/setup.sh -tag=13.0.40.3,32,AtlasProduction,slc3

# you can check that things are consistent by printing the CMTPATH variable. For instance, user ghodban should see:
echo $CMTPATH
/afs/in2p3.fr/home/g/ghodban/athena/13.0.40:/afs/in2p3.fr/group/atlas/sw/prod/releases/rel_13-4/AtlasProduction/13.0.40.3

Setting up the ATLAS CVS repository

We will need to check out some packages. CCIN2P3 hosts a CVS mirror of the ATLAS software. Here are the basic settings required to get it working.
Create or append to your ~/.ssh/config file (read by ssh for its configuration):
   
Host anoncvs.in2p3.fr
User atlascvs
Port 2222    
PubkeyAuthentication no
RSAAuthentication no
PasswordAuthentication yes
ForwardX11 no
You can get this file from this location:
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/config ~/.ssh/.

Don't forget to set these environment variables to get CVS working (use setenv if your $SHELL is tcsh):

export CVSROOT=atlascvs@anoncvs.in2p3.fr:/atlascvs
export CVS_RSH=ssh
export CMTCVSOFFSET=offline

Now, you should be able to check out packages.

Producing a MCatnlo Atlfast AOD

Before going further, we are going to spend some time showing how to use the Mcatnlo program for event generation at 10 TeV. This is done in three steps:

Installing Mcatnlo 3.2

First, install the Mcatnlo event generator. To avoid disk space issues, we assume that you have a $GROUP_DIR area:
cd $GROUP_DIR/$LOGNAME

In this directory, we download the version 3.2 of the Mcatnlo program.

mkdir mcatnlo
cd mcatnlo
# get the 3.2 version
wget http://www.hep.phy.cam.ac.uk/theory/webber/MCatNLO/Package32.tar.gz

# untar
tar zxvf Package32.tar.gz

In order to use Mcatnlo with the Les Houches Accord PDF library LHAPDF, we need to:

  • make some changes to the MCatNLO.input file to tell the installation script where the LHAPDF distributed with the ATLAS 13.0.40 kit is located, where the HERWIG include files are, etc.

  • make some changes to the Makefile, mcatnlo_uti.f and mcatnlo_lhauti.f files so that we avoid clashes between Fortran function names

Simply copy the files for which these issues are fixed:

cd $GROUP_DIR/$LOGNAME/mcatnlo
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/mcatnlo32/* .
Now you're ready to compile and then run Mcatnlo:
./MCatNLO.input
Running this script will automatically create a directory called Linux (the name may differ if you're using another operating system), in which you will find the ttbNLO_EXE_LHAPDF executable ready for use.

Producing ttbar events with Mcatnlo 3.2

Let's run it. To do this, create an input parameter file like:


 'mc13.005200'       ! prefix for BASES files
 'mc13.005200'       ! prefix for event files
 10000 1 1 1 1 ! energy, fren, ffact, frenmc, ffactmc
 -1706                          ! -1705/1706=bb/tt
 173                        ! M_Q
 0.32 0.32 0.5 1.55 4.95 0.75 ! quark and gluon masses
 'P'  'P'               ! hadron types
 'LHAPDF'   10000            ! PDF group and id number
  -1                     ! Lambda_5, <0 for default
 'MS'                   ! scheme
  10000                          ! number of events
  1                        ! 0 => wgt=+1/-1, 1 => wgt=+w/-w
  0                      ! seed for rnd numbers
  0.3                             ! zi
 10 10                 ! itmx1,itmx2

Then simply run:
cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux

ttbNLO_EXE_LHAPDF < ../mc13.005200.input

This will produce an event file mc13.005200.events ready to be used with ATHENA. The preparation of this file is described in the next section.
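
As an aside, the input card above is plain fixed-order text where each line is a value followed by a `!` comment. If you ever need to inspect or tweak such cards programmatically, a minimal sketch in plain Python (no ATLAS software required; the parse_card helper is our own invention, not part of Mcatnlo) could look like:

```python
# Minimal reader for an Mcatnlo-style input card: each line holds
# "value(s)  ! comment".  This is an illustrative helper, not an Mcatnlo tool.
def parse_card(text):
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        value, _, comment = line.partition("!")
        entries.append((value.strip(), comment.strip()))
    return entries

card = """\
 'mc13.005200'       ! prefix for BASES files
 10000 1 1 1 1       ! energy, fren, ffact, frenmc, ffactmc
 -1706               ! -1705/1706=bb/tt
"""
entries = parse_card(card)
print(entries[2])  # ('-1706', '-1705/1706=bb/tt')
```

Such a reader makes it easy to check, for instance, that the process code and number of events are what you intended before launching a long run.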

Producing Generator Pool Files (Parton-shower/Hadronization/Photos/Tauola/Jimmy)

The Mcatnlo event file mc13.005200.events contains events ready to be processed by Herwig for parton shower and hadronisation and by Jimmy for the underlying event. It is rather important to stick to the official ATLAS tools.
Let's prepare the Mcatnlo event file first. We need to change the name of the LHAPDF set used by the Mcatnlo_i interface:

cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux
sed -i 's/LHAPDF    10000/HWLHAPDF  10000/' mc13.005200.events

Next, we need to prepare the Mcatnlo_i input parameter file:

 'mc13.005200.events'        ! event file
  1000                         ! number of events
  1                        ! 0->Herwig PDFs, 1 otherwise
 'P'  'P'               ! hadron types
  5000.00 5000.00               ! beam momenta
  -1706                          ! -1705/1706=bb/tt
 'HWLHAPDF'                      ! PDF group (1). It was MRS in the original ttMCinput file
  10000                       ! PDF id number (1). It was 105 in the original ttMCinput file
 'HWLHAPDF'                      ! PDF group (2). It was MRS in the original ttMCinput file
  10000                       ! PDF id number (2). It was 105 in the original ttMCinput file
  -1                     ! Lambda_5, < 0 for default
  173                        ! M_Q
  0.32 0.32 0.5 1.55 4.95 0.75 ! quark and gluon masses

You can copy it from this location:
cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/mcatnlo32/mc13.005200.dat .
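
Note in passing that the two beam momenta of 5000.00 GeV in the card are what makes this a 10 TeV centre-of-mass setup, consistent with the event generation at 10 TeV mentioned earlier:

```python
# For symmetric colliding beams, sqrt(s) = 2 * E_beam (beam masses neglected)
beam_momentum_gev = 5000.0
sqrt_s_tev = 2 * beam_momentum_gev / 1000.0
print(sqrt_s_tev)  # 10.0
```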

We then need to put the two files mc13.005200.dat and mc13.005200.events in an archive mc13.005200.tar.gz used by ATHENA.

cd  $GROUP_DIR/$LOGNAME/mcatnlo/Linux
tar zcvf mc13.005200.tar.gz mc13.005200.events mc13.005200.dat

We need to check out this tag, which provides a fix so that one can run at 10 TeV centre-of-mass energy:

cd ~/athena/13.0.40
cmt co -r EvgenJobTransforms-00-06-04 Generators/EvgenJobTransforms
cd Generators/EvgenJobTransforms/*/cmt
source setup.sh
gmake

Let's prepare the input file for EvgenJobTransforms: we need the generic job option file required by the ATHENA McAtNlo_i interface.
To start, get the ATHENA job option file which was used for the CSC production.
cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux

# retrieve the CSC job option file and fix the input file name prefix (hard-coded)
get_files  CSC.005200.T1_McAtNlo_Jimmy.py
sed -i s"/mcatnlo31.005200.ttbar/mc13.005200/" CSC.005200.T1_McAtNlo_Jimmy.py

Then run it:

csc_evgen08_trf.py -t -l INFO runNumber=00001 firstEvent=1 maxEvents=1000 randomSeed=10000 jobConfig=CSC.005200.T1_McAtNlo_Jimmy.py outputEvgenFile=mc13.005200.gen.pool.root  inputGeneratorFile=mc13.005200.tar.gz  &>mc13.005200.log.1&

This should produce a generator POOL file, that one can then use for full simulation or fast simulation with Atlfast. This step is described in the next section.
Note that you can already check the contents of this generator POOL file by using the checkFile.py facility:

cd $GROUP_DIR/$LOGNAME/mcatnlo/Linux
checkFile.py mc13.005200.gen.pool.root &> pool.contents

## opening file [mc13.005200.gen.pool.root]...
## importing ROOT...
## importing ROOT... [DONE]
## opening file [OK]
File:mc13.005200.gen.pool.root
Size:    37173.304 kb
Nbr Events: 1115

================================================================================
     Mem Size       Disk Size        Size/Evt      items  (X) Container Name (X=Tree|Branch)
================================================================================
     796.464 kb       31.687 kb        0.028 kb     1115  (T) DataHeader
--------------------------------------------------------------------------------
     288.518 kb        7.508 kb        0.007 kb     1115  (B) EventInfo_p2_McEventInfo
   96966.967 kb    36760.912 kb       32.969 kb     1115  (B) McEventCollection_p3_GEN_EVENT
================================================================================
   98051.949 kb    36800.107 kb       33.005 kb     1115  TOTAL (POOL containers)
================================================================================
## Bye.
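
As a quick sanity check on such a listing, the Size/Evt column is simply the Disk Size divided by the number of events; for the McEventCollection_p3_GEN_EVENT branch above:

```python
# Size/Evt (kb) = Disk Size (kb) / Nbr Events, as reported by checkFile.py
disk_size_kb = 36760.912   # McEventCollection_p3_GEN_EVENT
n_events = 1115
print(round(disk_size_kb / n_events, 3))  # 32.969
```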

Producing Atlfast AODs

The production of Atlfast AODs is rather easy and straightforward when using csc_atlfast_trf.py:
csc_atlfast_trf.py ntupleFile=atlfast.root maxEvents=1000 skipEvents=0 outputAODFile=mc13.005200.atlfast.pool.root  inputEvgenFile=mc13.005200.gen.pool.root  &>mc13.005200.log.2&
We can check the contents of this Atlfast AOD, for instance to get familiar with the different container names. To do this, simply use the checkFile.py script:

## opening file [mc13.005200.atlfast.pool.root]...
## importing ROOT...
## importing ROOT... [DONE]
## opening file [OK]
File:mc13.005200.atlfast.pool.root
Size:    36793.779 kb
Nbr Events: 1000

================================================================================
     Mem Size       Disk Size        Size/Evt      items  (X) Container Name (X=Tree|Branch)
================================================================================
    2916.702 kb      114.783 kb        0.115 kb     1000  (T) DataHeader
--------------------------------------------------------------------------------
     335.096 kb        0.000 kb        0.000 kb     1000  (B) ElectronContainer_p1_AtlfastElectronCollection
     273.462 kb        0.000 kb        0.000 kb     1000  (B) PhotonContainer_p1_AtlfastPhotonCollection
     292.480 kb        0.000 kb        0.000 kb     1000  (B) TauJetContainer_p1_AtlfastTauJetContainer
     605.093 kb        0.732 kb        0.001 kb     1000  (B) MuonContainer_p1_AtlfastNonIsoMuonCollection
      68.544 kb        1.343 kb        0.001 kb     1000  (B) TruthParticleContainer_p5_SpclMC
     652.330 kb        1.511 kb        0.002 kb     1000  (B) MuonContainer_p1_AtlfastMuonCollection
      70.099 kb        4.581 kb        0.005 kb     1000  (B) MissingET_p1_AtlfastMissingEt
     256.088 kb        7.508 kb        0.008 kb     1000  (B) EventInfo_p2_McEventInfo
     528.524 kb       64.473 kb        0.064 kb     1000  (B) TauJetContainer_p1_AtlfastTauJet1p3pContainer
    2130.056 kb      313.702 kb        0.314 kb     1000  (B) INav4MomAssocs_p2_AtlfastMcAodAssocs
    4189.289 kb      360.320 kb        0.360 kb     1000  (B) ParticleJetContainer_p1_Cone4TruthParticleJets
    4209.423 kb      367.841 kb        0.368 kb     1000  (B) ParticleJetContainer_p1_Kt4TruthParticleJets
    4423.054 kb      380.327 kb        0.380 kb     1000  (B) ParticleJetContainer_p1_Cone7TruthParticleJets
    4608.956 kb      385.483 kb        0.385 kb     1000  (B) ParticleJetContainer_p1_Kt6TruthParticleJets
    2627.185 kb      592.927 kb        0.593 kb     1000  (B) ParticleJetContainer_p1_AtlfastParticleJetContainer
   19454.810 kb     7827.819 kb        7.828 kb     1000  (B) Rec::TrackParticleContainer_tlp1_AtlfastTrackParticles
   61318.476 kb    23965.933 kb       23.966 kb     1000  (B) McEventCollection_p3_GEN_AOD
================================================================================
  108959.667 kb    34389.283 kb       34.389 kb     1000  TOTAL (POOL containers)
================================================================================
## Bye.

Now we are ready to analyse this AOD and produce DPDs. This is the purpose of the next sections.

Producing DPDs from AODs

Some definitions

The current Event Data Model introduces a new data format, the Derived Physics Dataset (DPD), with the aim of further reducing the size of the analysis objects. A DPD is thus defined as a set of data which is a subset of the ESD or AOD content, with the possible addition of analysis data, analysis data being defined as quantities derived from data in the ESD or AOD.
Reducing the size of an event is done in three steps:


* Skimming: selecting only interesting events based on some event level quantity like e.g.:

  • number of electrons

  • Missing ET

The skimming can be implemented by cutting directly on TAG files.


* Thinning: selecting only interesting objects from a container, like e.g.:

  • keep all electrons with pT > 20 GeV


* Slimming: selecting only interesting properties of an object, like e.g.:

  • drop some of the calorimeter information from an electron
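
The three reduction steps can be illustrated on plain Python data, independently of ATHENA (the event structure and field names below are invented for the example):

```python
# Toy event: one event-level quantity and a jet container with per-jet details.
event = {
    "n_electrons": 1,
    "jets": [
        {"pt": 45000.0, "calo_info": [0.1, 0.2]},   # pT in MeV
        {"pt": 12000.0, "calo_info": [0.3, 0.4]},
    ],
}

# Skimming: keep or drop the whole event based on an event-level quantity
keep_event = event["n_electrons"] >= 1

# Thinning: keep only the interesting objects of a container
event["jets"] = [jet for jet in event["jets"] if jet["pt"] > 20000.0]

# Slimming: keep only the interesting properties of each surviving object
for jet in event["jets"]:
    jet.pop("calo_info")

print(keep_event, event["jets"])  # True [{'pt': 45000.0}]
```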

Some useful acronyms that you need to be familiar with:

* A D1PD is defined as a centrally produced DPD from an AOD or a set of AODs using the working group DPD tool maker, like e.g. for the top working group: TopPhysDPDMaker

* A D2PD is defined as a privately made, or customised DPD produced from a D1PD or an AOD

* A D3PD is defined as an ntuple made from a D1PD, a D2PD or an AOD. This is somewhat similar to the EventView/TopView approach.

This tutorial will focus mainly on the production of D1PD and D2PD. D3PDs are not considered here.

Finally if you want to follow the evolution of the current scheme, you need to keep an eye on the following packages:

  • PhysicsAnalysis/AthenaROOTAccess

  • PhysicsAnalysis/AthenaROOTAccessExamples

  • PhysicsAnalysis/DPDUtils

and watch the two relevant hypernews forums.

Finally, if you're interested in the recommendations of the Analysis Model Report, you can read it as a PDF.

Example 1: Produce a DPD from an AOD

We have previously seen how to produce an Atlfast AOD; we would now like to produce a DPD from it. There is not much to it: you simply need to define an ATHENA POOL output stream (AthenaPoolOutputStream), which you can call StreamDPD, and tell it which items it should contain by adding them to AthenaPoolOutputStream::ItemList.

The script looks like:

#------------------------------------------------------------------------
# import the data types
import EventKernel.ParticleDataType

#------------------------------------------------------------------------
# get a handle on the ServiceManager which holds all the services
from AthenaCommon.AppMgr import ServiceMgr

#------------------------------------------------------------------------
# Particle Properties
from PartPropSvc.PartPropSvcConf import PartPropSvc

#------------------------------------------------------------------------
# the Converters
import AthenaPoolCnvSvc.ReadAthenaPool

include ( "ParticleBuilderOptions/ESD_PoolCnv_jobOptions.py" )
include ( "ParticleBuilderOptions/AOD_PoolCnv_jobOptions.py" )
include ( "ParticleBuilderOptions/McAOD_PoolCnv_jobOptions.py" )
include ( "EventAthenaPool/EventAthenaPool_joboptions.py" )

#------------------------------------------------------------------------
# our DPD contents will be defined here below
from AthenaPoolCnvSvc.WriteAthenaPool import AthenaPoolOutputStream
StreamDPD = AthenaPoolOutputStream( "StreamDPD" )

# We want first to store everything which is an EventInfo object
StreamDPD.ItemList  =  ['EventInfo#*']

# We want to store the muons smeared by the Atlfast simulation
StreamDPD.ItemList += ['Analysis::MuonContainer#AtlfastMuonCollection']

#- we want to keep electrons as well
StreamDPD.ItemList += ["ElectronContainer#AtlfastElectronCollection"]

#- and of course the jets (Atlfast jets by default use a cone algorithm with R=0.4)
StreamDPD.ItemList += ['ParticleJetContainer#AtlfastParticleJetContainer']

#------------------------------------------------------------------------
StreamDPD.ForceRead= TRUE
StreamDPD.OutputFile= 'DPD.pool.root'

#------------------------------------------------------------------------
ServiceMgr.MessageSvc = Service( "MessageSvc" )
ServiceMgr.MessageSvc.OutputLevel = INFO
ServiceMgr.MessageSvc.Format = "% F%75W%S%7W%R%T %0W%M"
ServiceMgr.MessageSvc.defaultLimit=10000000

#------------------------------------------------------------------------
ServiceMgr.EventSelector.InputCollections = ['rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/mc13.005200.10TeV.atlfast.AOD._00200.v13004002.pool.root']

#------------------------------------------------------------------------
# Number of Events to process
theApp.EvtMax = -1

Test: copy this script and execute it to produce a DPD from the Atlfast AOD


# if not already created (or use another directory where you have enough disk space)
cd $GROUP_DIR/$LOGNAME
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example1_jobOptions.py .
athena Example1_jobOptions.py &>example1.log

Example 2: Produce a DPD with a filter from an AOD

In this example, we want to do exactly the same thing as in the previous section, but this time, we would like to write in the DPD only events which fulfill the following conditions:

* at least one lepton,

* at least four jets,

* a missing transverse energy above 20 GeV.

The implementation of this filter is straightforward, as shown below. We define an algorithm called myFilter which inherits from PyAlgorithm. This algorithm has three methods: initialize, called once at the beginning; finalize, called at the end; and execute, called for each event. In the execute method, we retrieve from StoreGate the electron, muon and jet containers, as well as the missing transverse energy, using the PyParticleTools methods. Once these containers are loaded, we can either loop over the particles or, as done in this example, simply count them.

from gaudimodule import PyAlgorithm
import PyParticleTools.PyParticleTools as PyParticleTools
import PyAnalysisCore.PyEventTools as PyEventTools

#---------------------------------------------------------------------------
class myFilter( PyAlgorithm ):
#---------------------------------------------------------------------------
    def __init__ ( self, name ) :
        PyAlgorithm.__init__(self,name)
#---------------------------------------------------------------------------
    def initialize(self):
        print "Initializing myFilter"
        return True
#---------------------------------------------------------------------------
    def finalize(self):
        return True            
#---------------------------------------------------------------------------
    def execute(self):
        # we assume this is a good event
        self.setFilterPassed(True)
        # retrieve the lepton containers
        ElectronContainer  = PyParticleTools.getElectrons("AtlfastElectronCollection")
        MuonContainer      = PyParticleTools.getMuons("AtlfastMuonCollection")
        # require at least one lepton
        if ElectronContainer.size() + MuonContainer.size() < 1:
            self.setFilterPassed(False)

        # retrieve jets
        ParticleJetContainer = PyParticleTools.getParticleJets("AtlfastParticleJetContainer")
        # require at least four jets
        if ParticleJetContainer.size() < 4:
            self.setFilterPassed(False)
        # retrieve the missing ET and require it to be above 20 GeV (ATHENA units are MeV)
        MissingET        = PyParticleTools.getMissingET("AtlfastMissingEt")
        et = MissingET.et()
        if et < 20000:  # 20 GeV expressed in MeV
            self.setFilterPassed(False)
        if self.filterPassed():
            print "Filter passed: a good event!"
        else:
            print "Filter failed: a bad event!"
        return True

Of course this filter can be extended to select particles in a given eta range and with a pT threshold. To learn all the methods that one can use for electrons, muons and jets, have a look at these ATHENA classes:

* Analysis::Electron

* Analysis::Muon

* ParticleJet
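
As an illustration of such an extension, the counting could be restricted to leptons inside an eta/pT window. The sketch below is plain Python with a stand-in particle class; only the pt()/eta() accessor style mimics what the real ATHENA classes provide:

```python
# Stand-in particle exposing pt()/eta() accessors in the ATHENA style.
class Particle:
    def __init__(self, pt, eta):
        self._pt, self._eta = pt, eta
    def pt(self):
        return self._pt
    def eta(self):
        return self._eta

def count_good_leptons(container, pt_min=20000.0, eta_max=2.5):
    """Count leptons with pT above pt_min (MeV) and |eta| below eta_max."""
    return sum(1 for p in container if p.pt() > pt_min and abs(p.eta()) < eta_max)

electrons = [Particle(25000.0, 1.2), Particle(30000.0, 3.0), Particle(8000.0, 0.5)]
print(count_good_leptons(electrons))  # 1
```

The same counting logic, applied to the real containers returned by PyParticleTools, would slot directly into the execute method above.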

This filter is then attached to the StreamDPD by adding to the Example1_jobOptions.py script the following lines:

include( "PyAnalysisCore/InitPyAnalysisCore.py")
include("Example2_filter.py")

myFilter= myFilter("myFilter")
theApp.TopAlg += ["myFilter"]
StreamDPD.AcceptAlgs=["myFilter"]
We first initialize the Python-based analysis software (required, otherwise you will not be able to access the containers via StoreGate). Then we include our filter, which we call Example2_filter.py. We instantiate a filter algorithm called myFilter, add it to the algorithm sequence theApp.TopAlg, and finally attach it to the StreamDPD AthenaPoolOutputStream, which will only be executed if myFilter passed ( myFilter::filterPassed() ).

Test: copy the Example2_jobOptions.py script and execute it to produce a DPD from the Atlfast AOD

cd $GROUP_DIR/$LOGNAME
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example2_filter.py .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example2_jobOptions.py .
athena Example2_jobOptions.py &>example2.log

At this point, you might want to improve the above filter. If you're not familiar with the methods used to access particle properties, have a look at the self-explanatory examples given in the PhysicsAnalysis/DPDUtils package.

Producing a private D2PD from this DPD

From the DPD.pool.root that you just produced, produce a new DPD using Example1_jobOptions.py. Don't forget to set ServiceMgr.EventSelector.InputCollections and StreamDPD.OutputFile accordingly.

Producing a D1PD using the TopPhysDPDMaker

Brief description of TopPhysDPDMaker

A set of tools is currently being developed for DPD creation. Most, if not all, of them are inspired by the various examples that one can find in the PhysicsAnalysis/DPDUtils package.
For the case of the Top Working Group, a D1PD, D2PD and D3PD tool was recently released and is supported from release 13.0.40 upwards: TopPhysDPDMaker. Before going further, let's look at what this tool does and how to run it.
Let's first compile it:
cd $HOME/athena/13.0.40
cmt co -r TopPhysDPDMaker-00-00-11 PhysicsAnalysis/TopPhys/TopPhysDPDMaker
cd PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/cmt
source setup.sh
make

Let's look at its structure just to understand what is behind it.

ls $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/share
 
CVS                         D1PDSlimming_jobOptions.py     ElectroweakD2PD_topOptions.py  MetaData_jobOptions.py    TopPhysDPDSubmission
D1PDItemList_jobOptions.py  ElectroweakD1PD_topOptions.py  ElectroweakD3PD_topOptions.py  TopDPD_ProdOptionsAOD.py  TriggerObjectItemList_jobOptions.py

The different files in the TopPhysDPDMaker/share directory are:

* TopPhysDPDSubmission: a script for GRID submission using PATHENA

* TopDPD_ProdOptionsAOD.py: a D1PD production job option

* ElectroweakD1PD_topOptions.py: the D1PD production script. Let's edit it to get familiar with it. First, you can see this line:

from TopPhysDPDMaker.inclusive_lepFilterAlgorithm import *
We import inclusive_lepFilterAlgorithm, defined in the file python/inclusive_lepFilterAlgorithm.py, which is inspired by what one can find in DPDUtils/share/semilep_ttbarFilterAlgorithm.py.
Looking at this inclusive_lepFilterAlgorithm.py file, we see that the inclusive lepton filter passes if the event contains at least one lepton (electron or muon) within the eta and pT range (both the MuidMuonCollection AND StacoMuonCollection collections are treated on the same footing).

Next, we see in the ElectroweakD1PD_topOptions.py, this line:
from TopPhysDPDMaker.slimJets import *
Again, here we import the python/slimJets.py script (algorithm), with which the jet collections Cone4H1TowerParticleJets, Cone4H1TopoParticleJets and Kt6H1TowerParticleJets are slimmed using the ParticleJet::removeAll() and ParticleJet::removeInfo() methods.

Then, we call the slimTracks algorithm (implemented in python/slimTracks.py):
from TopPhysDPDMaker.slimTracks import *
to slim all tracks from the TrackParticleCandidate container with pT < 5 GeV.

Then, we see the creation of the DPD with the constraint that the inclusive_lepFilter is fulfilled.
StreamDPD = AthenaPoolOutputStream( "StreamDPD" )
StreamDPD.AcceptAlgs=["inclusive_lepFilter"]

The D1PD item list is defined by including:

include("TopPhysDPDMaker/D1PDItemList_jobOptions.py")
Let's have a look at it and see what is stored in the produced D1PD:
# Items for primary DPD
StreamDPD.ItemList =  ['EventInfo#*']
StreamDPD.ItemList += ['Rec::TrackParticleContainer#TrackParticleCandidate']
StreamDPD.ItemList += ['VxContainer#VxPrimaryCandidate']
StreamDPD.ItemList += ['ParticleJetContainer#Kt6H1TowerParticleJets']
StreamDPD.ItemList += ['ParticleJetContainer#Cone4H1TowerParticleJets']
StreamDPD.ItemList += ['ParticleJetContainer#Cone4H1TopoParticleJets']
StreamDPD.ItemList += ['egammaContainer#ElectronAODCollection']
StreamDPD.ItemList += ['egammaContainer#PhotonAODCollection']
StreamDPD.ItemList += ['egDetailContainer#egDetailAOD']
StreamDPD.ItemList += ['Analysis::TauJetContainer#*']
StreamDPD.ItemList += ['Analysis::TauDetailsContainer#*']
StreamDPD.ItemList += ['Rec::TrackParticleContainer#MuonboyMuonSpectroOnlyTrackParticles']
StreamDPD.ItemList += ['Rec::TrackParticleContainer#StacoTrackParticles']
StreamDPD.ItemList += ['MissingET#MET_RefFinal']
StreamDPD.ItemList += ['Analysis::MuonContainer#StacoMuonCollection']
StreamDPD.ItemList += ['Analysis::MuonContainer#MuidMuonCollection']
StreamDPD.ItemList += ['Analysis::MuonContainer#CaloMuonCollection']
StreamDPD.ItemList += ['Rec::MuonSpShowerContainer#*']
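
Each ItemList entry has the form 'Type#Key', where the key part may be a wildcard such as `*`. The following plain-Python sketch (using fnmatch for the wildcard; this illustrates the matching idea only, it is not the actual ATHENA implementation) shows how such a selection behaves:

```python
import fnmatch

def selected(item_list, type_name, key):
    """Check whether a (type, key) pair matches any 'Type#Key' pattern."""
    for entry in item_list:
        t, _, k = entry.partition("#")
        if t == type_name and fnmatch.fnmatch(key, k):
            return True
    return False

item_list = ["EventInfo#*", "Analysis::MuonContainer#StacoMuonCollection"]
print(selected(item_list, "EventInfo", "McEventInfo"))                       # True
print(selected(item_list, "Analysis::MuonContainer", "MuidMuonCollection"))  # False
```

So 'EventInfo#*' keeps every EventInfo object regardless of its StoreGate key, while the explicit entries keep exactly one container each.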


  • ElectroweakD2PD_topOptions.py: the job script for producing a D2PD. So far it is only a copy of ElectroweakD1PD_topOptions.py, and this is the script that we will use to produce our D2PD

  • ElectroweakD3PD_topOptions.py: the job script for producing a D3PD (ntuples a la TopView) with analysis objects (not considered here!).

In order to produce a D3PD, i.e. an EventView/TopView ntuple, you need to check out these packages:
cmt co -r TopPhysTools-13-00-40-07  PhysicsAnalysis/TopPhys/TopPhysTools
cmt co -r EventViewUserData-00-01-18-04  PhysicsAnalysis/EventViewBuilder/EventViewUserData
cmt co -r HighPtView-00-01-11 PhysicsAnalysis/HighPtPhys/HighPtView
cmt co -r HEAD PhysicsAnalysis/TopPhys/TopPhysUtils/HitFit
and follow the explanations given on this Twiki

Producing a D2PD with TopPhysDPDMaker

To create a D2PD using the TopPhysDPDMaker tool, you need to use the share/ElectroweakD2PD_topOptions.py script. To do this, create a job option file with the following commands:

# AOD that you want to run over
InputCollections = ['rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/trig1_misal1_mc12.005200.T1_McAtNlo_Jimmy.recon.AOD.v13003004_tid018416._00002.pool.root.1']

# maximum number of events to process (this is not the number of events to be written to the DPD!)
EvtMax=10

include("TopPhysDPDMaker/ElectroweakD2PD_topOptions.py")

You can copy this script from the tutorial repository:

# go to the TopPhysDPDMaker test area
cd  $HOME/athena/13.0.40/PhysicsAnalysis/TopPhys/TopPhysDPDMaker/*/test

# copy this script
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Example_TopPhysDPDMaker_D2PD.py .

# run it
athena Example_TopPhysDPDMaker_D2PD.py &> D2PD.log

You can of course add things that you think are missing from your D2PD. This is straightforward: in the above script, Example_TopPhysDPDMaker_D2PD.py, add the list of items which you need to the StreamDPD at the end.
For instance, as you can see from the D1PDItemList_jobOptions.py file, the calibrated topological clusters using Local Hadron Calibration are missing, and thus you cannot rerun a jet algorithm such as the kT algorithm.
In order to solve this issue, simply add the container to the item list at the end of your script.

InputCollections = ['rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/trig1_misal1_mc12.005200.T1_McAtNlo_Jimmy.recon.AOD.v13003004_tid018416._00002.pool.root.1']
EvtMax=10

include("TopPhysDPDMaker/ElectroweakD2PD_topOptions.py")

# add the calibrated topo-clusters (to be able to re-run jet algorithms ;-)
StreamDPD.ItemList += ['CaloClusterContainer#CaloCalTopoCluster']

Rerun your Example_TopPhysDPDMaker_D2PD.py:

athena Example_TopPhysDPDMaker_D2PD.py &> D2PD.log
You can then use the checkFile.py utility to check that the CaloCalTopoCluster collection is now in your D2PD:
checkFile.py  Electroweak.D2PD.pool.root  &>D2PD.contents

Look at your D2PD.contents file and you will understand why this collection was dropped in order to keep the D1PD at about 10% of the AOD size:

## opening file [Electroweak.D2PD.pool.root]...
## importing ROOT...
## importing ROOT... [DONE]
## opening file [OK]
File:Electroweak.D2PD.pool.root
Size:     1139.470 kb
Nbr Events: 10

================================================================================
     Mem Size       Disk Size        Size/Evt      items  (X) Container Name (X=Tree|Branch)
================================================================================
     145.210 kb        0.000 kb        0.000 kb       10  (T) DataHeader
--------------------------------------------------------------------------------
     106.726 kb        0.000 kb        0.000 kb       10  (B) ElectronContainer_p1_ElectronAODCollection
      63.666 kb        0.000 kb        0.000 kb       10  (B) PhotonContainer_p1_PhotonAODCollection
      53.986 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_AtlfastTauJet1p3pContainer
      50.448 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_AtlfastTauJetContainer
      73.368 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_Tau1P3PContainer
      79.288 kb        0.000 kb        0.000 kb       10  (B) TauJetContainer_p1_TauRecContainer
       8.779 kb        0.000 kb        0.000 kb       10  (B) MissingET_p1_MET_RefFinal
     312.497 kb        0.000 kb        0.000 kb       10  (B) TauDetailsContainer_tlp1_Tau1P3PDetailsContainer
     311.751 kb        0.000 kb        0.000 kb       10  (B) TauDetailsContainer_tlp1_TauRecDetailsContainer
     105.940 kb        0.000 kb        0.000 kb       10  (B) MuonContainer_p1_CaloMuonCollection
     103.398 kb        0.000 kb        0.000 kb       10  (B) MuonContainer_p1_MuidMuonCollection
     105.516 kb        0.000 kb        0.000 kb       10  (B) MuonContainer_p1_StacoMuonCollection
      62.622 kb        0.000 kb        0.000 kb       10  (B) egDetailContainer_p1_egDetailAOD
       2.058 kb        0.000 kb        0.000 kb       10  (B) MuonSpShowerContainer_p1_MuonSpShowers
     231.486 kb        0.000 kb        0.000 kb       10  (B) Rec::TrackParticleContainer_tlp1_MuonboyMuonSpectroOnlyTrackParticles
     229.926 kb        0.000 kb        0.000 kb       10  (B) Rec::TrackParticleContainer_tlp1_StacoTrackParticles
      25.933 kb        0.000 kb        0.000 kb       10  (T) Trk::MVFVxContainer_tlp1
     361.583 kb        0.000 kb        0.000 kb       40  (T) POOLContainer_TauDetailsContainer_tlp1
      88.916 kb        0.000 kb        0.000 kb       40  (T) POOLContainer_TauJetContainer_p1
      78.177 kb        1.058 kb        0.106 kb       10  (B) EventInfo_p2_McEventInfo
     118.611 kb        6.828 kb        0.683 kb       10  (B) ParticleJetContainer_p1_Cone4H1TopoParticleJets
     113.652 kb       15.072 kb        1.507 kb       10  (B) ParticleJetContainer_p1_Cone4H1TowerParticleJets
     161.068 kb       25.350 kb        2.535 kb       10  (B) ParticleJetContainer_p1_Kt6H1TowerParticleJets
     487.097 kb       28.580 kb        2.858 kb       10  (B) Rec::TrackParticleContainer_tlp1_TrackParticleCandidate
     334.299 kb       56.401 kb        5.640 kb       10  (B) Trk::VxContainer_tlp1_VxPrimaryCandidate
    1026.691 kb      205.808 kb       20.581 kb       10  (B) CaloClusterContainer_p2_CaloCalTopoCluster
================================================================================
    4842.692 kb      339.097 kb       33.910 kb       10  TOTAL (POOL containers)
================================================================================
## Bye.

Producing a D3PD with TopPhysDPDMaker

TopPhysDPDMaker can also dump flat ntuples or, to stick to the new terminology, a D3PD. This part is not covered in this tutorial. Note that making the D3PD directly from the AOD has several advantages: for example, the trigger information is handled for you.

Running your own algorithm and dump a D2PD (with e.g. the di-jet mass) with ATHENA and with ROOT

In this section, we show how to create an algorithm, run it as an ATHENA algorithm and then, with AthenaROOTAccess, repeat the same thing in ROOT.
To illustrate this, we will write a small algorithm which, given a jet container, selects jets passing given pT and eta cuts and finds the two closest jets in delta R.
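Before writing the ATHENA version, note that the selection and pairing logic itself is simple to state. Here is a minimal plain-Python sketch of it (the dict-based jets and the function names are illustrative, not part of the package):

```python
import math

def delta_r(j1, j2):
    """Delta R = sqrt(deta^2 + dphi^2), with dphi wrapped into [-pi, pi]."""
    deta = j1["eta"] - j2["eta"]
    dphi = (j1["phi"] - j2["phi"] + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(deta, dphi)

def select_jets(jets, pt_cut, eta_cut):
    """Keep jets with pT above pt_cut and |eta| below eta_cut."""
    return [j for j in jets if j["pt"] > pt_cut and abs(j["eta"]) < eta_cut]

def closest_pair(jets):
    """Return the two jets with the smallest Delta R separation."""
    pairs = [(delta_r(a, b), a, b)
             for i, a in enumerate(jets) for b in jets[i + 1:]]
    _, a, b = min(pairs, key=lambda p: p[0])
    return a, b
```

The ATHENA algorithm below does the same thing on ParticleJetContainer objects, with the cuts set as algorithm properties.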

Create your algorithm

The creation of an ATHENA algorithm is described in more details in this Twiki. Simply follow the few command lines below:

cd $HOME/athena/13.0.40

# create a new package called MyNewPackage
cmt create MyNewPackage MyNewPackage-00-00-01

# the following directory will contain our job-option files
mkdir $HOME/athena/13.0.40/MyNewPackage/*/share/

mkdir $HOME/athena/13.0.40/MyNewPackage/*/run/

mkdir -p $HOME/athena/13.0.40/MyNewPackage/*/src/components

cd $HOME/athena/13.0.40/MyNewPackage/*/cmt/

# copy the requirement file containing list of required packages
cp -f /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/requirements .

cd $HOME/athena/13.0.40/MyNewPackage/*/src/

# copy the algorithm MyAlg.cxx
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyAlg.cxx .

# copy this class called WBosonBuilder
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/WBosonBuilder.cxx .

cd $HOME/athena/13.0.40/MyNewPackage/*/src/components
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackage_entries.cxx .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackage_load.cxx .

# copy all header files
cd $HOME/athena/13.0.40/MyNewPackage/*/MyNewPackage
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyAlg.h .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/WBosonBuilder.h .

# in order to be able to have our software run with ROOT as well, we need to have this file containing all header files which will be used for the LCG dictionary generation!
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyNewPackageDict.h .

# ask CMT for configuration and script generation
cd $HOME/athena/13.0.40/MyNewPackage/*/cmt/
cmt config
source setup.sh
make

While the algorithm is compiling, let's look at our requirements file and see what it contains that makes the WBosonBuilder object callable from ROOT!

      1 package MyNewPackage
      2 
      3 use AtlasPolicy                  AtlasPolicy-01-*
      4 use GaudiInterface           GaudiInterface-01-*         External
      5 use JetTagEvent                 JetTagEvent-*                   PhysicsAnalysis/JetTagging       ---> to access the ParticleJet and ParticleJetContainer classes
      6 use FourMom                    FourMom-*                       Event                                           ---> to access P4Help class
      7 use FourMomUtils             FourMomUtils-*                Event
      8 
      9 library MyNewPackage *.cxx -s=components *.cxx
     10 apply_pattern component_library
     11 
     12 apply_pattern declare_joboptions files="MyJobOptions.py"                                       ---> Name of our python job-option file
     13 
     14 # In order to be able to load all this in a ROOT session
     15 private
     16 use AtlasReflex   AtlasReflex-00-*   External -no_auto_imports                                   ---> load genreflex for LCG dictionary (lcgdict) generation
     17 apply_pattern lcgdict dict=MyNewPackage selectionfile=selection.xml headerfiles="../MyNewPackage/MyNewPackageDict.h"   ---> the list of header files for which we want the LCG dictionaries
     18 end_private

genreflex/lcgdict is the tool for LCG dictionary generation; it is interfaced to ATLAS through the AtlasReflex package. In MyNewPackageDict.h, we put the header files for the objects we want to include in MyNewPackage.so and MyNewPackageDict.so, which will be loaded in ROOT (to know more, see the LCGDictionary twiki).
The selection.xml is the class selection file used to specify for which classes the dictionaries must be generated.
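For reference, a minimal selection.xml for this package could look like the following (a sketch, not the file shipped with the package; the exact class list depends on what you want visible from ROOT):

```xml
<lcgdict>
   <!-- request a Reflex dictionary for our helper class -->
   <class name="WBosonBuilder"/>
</lcgdict>
```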

If compilation is still not finished, you can in the meantime open the source files for the MyAlg and WBosonBuilder classes, which select jets with given pT and eta cuts, and see how they are written.

Once compilation has run smoothly, we can see how to run this algorithm with ATHENA and then how to use WBosonBuilder with ROOT!

Run your algorithm with ATHENA

We want to run MyAlg, which is an interface to WBosonBuilder, over one Atlfast AOD. This is done through MyJobOptions.py, in which we instantiate MyAlg. Let's copy it and look at it:

cd $HOME/athena/13.0.40/MyNewPackage/*/share/
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyJobOptions.py .
The MyJobOptions.py file looks like:
      1 
      2 #------------------------------------------------------------------------
      3 # import the data types
      4 import EventKernel.ParticleDataType
      5 
      6 #------------------------------------------------------------------------
      7 # get a handle on the ServiceManager which holds all the services
      8 from AthenaCommon.AppMgr import ServiceMgr
      9 
     10 # the converters
     11 import AthenaPoolCnvSvc.ReadAthenaPool
     12 include ( "ParticleBuilderOptions/ESD_PoolCnv_jobOptions.py" )
     13 include ( "ParticleBuilderOptions/AOD_PoolCnv_jobOptions.py" )
     14 include ( "ParticleBuilderOptions/McAOD_PoolCnv_jobOptions.py" )
     15 include ( "EventAthenaPool/EventAthenaPool_joboptions.py" )
     16 
     17 #------------------------------------------------------------------------
     18 # give the library
     19 theApp.Dlls   += ["MyNewPackage"]
     20 theApp.Dlls   += ["MyNewPackageDict"]
     21 
     22 #------------------------------------------------------------------------
     23 # Full job is a list of algorithms
     24 from AthenaCommon.AlgSequence import AlgSequence
     25 TopSeq = AlgSequence()
     26 
     27 # Add top algorithms to be run
     28 from MyNewPackage.MyNewPackageConf import MyAlg
     29 WBoson = MyAlg("WBoson")
     30 WBoson.OutputLevel = INFO
     31 WBoson.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
     32 WBoson.OutputHadronicWBosonContainerName='myWBoson'
     33 WBoson.CutJetPt  = 30 * GeV
     34 WBoson.CutJetEta = 2.5
     35 #------------------------------------------------------------------------
     36 # add this algorithm to the algorithm sequence
     37 TopSeq += [WBoson]
     38 
     39 #------------------------------------------------------------------------
     40 # Set output level threshold (2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL )
     41 from AthenaCommon.AppMgr import ServiceMgr
     42 ServiceMgr.MessageSvc = Service( "MessageSvc" )
     43 ServiceMgr.MessageSvc.OutputLevel = DEBUG
     44 ServiceMgr.MessageSvc.Format = "% F%75W%S%7W%R%T %0W%M"
     45 ServiceMgr.MessageSvc.defaultLimit=10000000
     46 
     47 #------------------------------------------------------------------------
     48 ServiceMgr.EventSelector.InputCollections = ['AOD.pool.root'];
     49 
     50 #------------------------------------------------------------------------
     51 # Number of Events to process
     52 theApp.EvtMax = -1

At line 29, we instantiate the algorithm with WBoson = MyAlg("WBoson"), and then we set its properties: the message level, the input jet container name, and the container name under which the reconstructed hadronic W bosons will be stored in StoreGate. Finally, we set the two values for the pT and eta cuts used for the jet skimming.

     29 WBoson = MyAlg("WBoson")
     30 WBoson.OutputLevel = INFO
     31 WBoson.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
     32 WBoson.OutputHadronicWBosonContainerName='myWBoson'
     33 WBoson.CutJetPt  = 30 * GeV
     34 WBoson.CutJetEta = 2.5

Let's run it to check that it works fine.

cd $HOME/athena/13.0.40/MyNewPackage/*/share/
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/MyNewPackage/MyJobOptions.py .
cd $HOME/athena/13.0.40/MyNewPackage/*/run/

# run it
athena  ../share/MyJobOptions.py &> athena.log
You can have a look at the athena.log file to see that your algorithm runs nicely and reconstructs a jet from the two closest jets in (eta, phi) space.

Now, we would like to store this object in a DPD and then view the mass distribution with ROOT. This is explained in the next section.

Run your algorithm with ATHENA and produce a DPD which contains the di-jet invariant mass

Test: Now, we would like to rerun the algorithm above, but this time, we would like to store in a DPD the reconstructed hadronic W boson candidates!

Solution: This is rather easy: we simply need to add the Athena POOL output stream to the previous job-option file, as we did previously, and add our container to the ItemList:

from AthenaPoolCnvSvc.WriteAthenaPool import AthenaPoolOutputStream
StreamDPD = AthenaPoolOutputStream( "StreamDPD" )

# everything which is an EventInfo object
StreamDPD.ItemList  =  ['EventInfo#*']

#- my di-Jet mass
StreamDPD.ItemList += ['ParticleJetContainer#myWBoson']

StreamDPD.ForceRead= TRUE
StreamDPD.OutputFile= 'DPD.pool.root'

Again, you can check that the produced DPD contains the myWBoson container using checkFile.py
Test: Store in the DPD two di-jet masses: the first one, called diJetMass40, for a jet pT > 40 GeV, and the second one, labeled diJetMass30, for a jet pT > 30 GeV.
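A possible solution sketch for this test, assuming MyAlg exposes the properties shown above: instantiate the algorithm twice with different cuts and output container names, then add both containers to the ItemList (an untested fragment, to be merged into the job options above):

```python
# two MyAlg instances with different pT thresholds (sketch)
diJet40 = MyAlg("diJet40")
diJet40.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
diJet40.OutputHadronicWBosonContainerName = 'diJetMass40'
diJet40.CutJetPt  = 40 * GeV
diJet40.CutJetEta = 2.5

diJet30 = MyAlg("diJet30")
diJet30.InputParticleJetContainerName = 'AtlfastParticleJetContainer'
diJet30.OutputHadronicWBosonContainerName = 'diJetMass30'
diJet30.CutJetPt  = 30 * GeV
diJet30.CutJetEta = 2.5

TopSeq += [diJet40, diJet30]

# write both containers to the DPD
StreamDPD.ItemList += ['ParticleJetContainer#diJetMass40',
                       'ParticleJetContainer#diJetMass30']
```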

Viewing the di-jet invariant mass with ROOT

In this section, we show how, using AthenaROOTAccess, we can view the di-jet mass distribution.
Let's first install AthenaROOTAccess. For release 13.0.40, we will use the recommended tag AthenaROOTAccess-00-00-38-09:

cd $HOME/athena/13.0.40
cmt co -r AthenaROOTAccess-00-00-38-09  PhysicsAnalysis/AthenaROOTAccess
cd $HOME/athena/13.0.40/PhysicsAnalysis/AthenaROOTAccess/*/cmt
source setup.sh
make

Once compilation is finished, we will, within a ROOT session, create a histogram and read the transient TTree created by executing a python script called test.py, provided with AthenaROOTAccess-00-00-38-09. Let's get it:

cd $HOME/athena/13.0.40/MyNewPackage/*/run/
get_files test.py

You then need to fix the name of the DPD file in this python script. Let's do it:

mv test.py loadDPD.py
sed -i 's/AOD.pool.root/DPD.pool.root/' loadDPD.py
Let's then load the script in ROOT

// start ROOT
root -l

// load the DPD using the TPython module
// if you want to do this with something written in C++, you would need to rewrite the AthenaROOTAccess/python/transientTree.py in C++
TPython::ExecScript("loadDPD.py")

// list the current directory contents: you should see the CollectionTree_trans
gDirectory->ls();

// get the transient TTree whose name is CollectionTree_trans (can be changed through the python script above)
TTree *t = (TTree*)gDirectory->Get("CollectionTree_trans");

// Now let's print the TTree: you should see the myWBoson container!
t->Print();

// let's get the TBranch
TBranch *bPJC =  (TBranch*)t->GetBranch("myWBoson");

// create a ParticleJetContainer variable
ParticleJetContainer *PJC = new ParticleJetContainer;

// attach the ParticleJetContainer to it
bPJC->SetAddress(&PJC);

// let's get one entry
bPJC->GetEntry(0);

// get the reconstructed particle built from the two closest jets in eta, phi
ParticleJet *PJ  = (*PJC)[0];

// Print  the mass for the reconstructed invariant di-jet mass
std::cout << PJ->m() << std::endl;

Exercise: create a histogram h1_mass, fill it with the reconstructed mass and draw it!
Solution:

// start ROOT
root -l

// load the DPD using the TPython module
// if you want to do this with something written in C++, you would need to rewrite the AthenaROOTAccess/python/transientTree.py in C++
TPython::ExecScript("loadDPD.py")

// list the current directory contents: you should see the CollectionTree_trans
gDirectory->ls();

// get the transient TTree whose name is CollectionTree_trans (can be changed through the python script above)
TTree *t = (TTree*)gDirectory->Get("CollectionTree_trans");

// Now let's print the TTree: you should see the myWBoson  container!
t->Print();

// let's get the TBranch
TBranch *bPJC =  (TBranch*)t->GetBranch("myWBoson");

// create a ParticleJetContainer variable
ParticleJetContainer *PJC = new ParticleJetContainer;

// attach the ParticleJetContainer to it
bPJC->SetAddress(&PJC);

// let's print the total number of entries
std::cout << t->GetEntriesFast() << std::endl;

// create the TH1F histogram
TH1F *h1_jj_mass =  new TH1F("h1_jj_mass","",100,0,500);

for (int event = 0; event < t->GetEntriesFast(); event++) {
   bPJC->GetEntry(event);
   if (PJC->size() > 0) h1_jj_mass->Fill( ((*PJC)[0])->m()/1000 );
}

// Draw the distribution
h1_jj_mass->Draw();

Run your algorithm with ROOT

In this section, we would like to show how the class WBosonBuilder can be used within a ROOT session to analyse an AOD or a D1PD. As shown before, adding these lines to the requirements file:
     15 private
     16 use AtlasReflex   AtlasReflex-00-*   External -no_auto_imports                                   ---> load genreflex for LCG dictionary generation
     17 apply_pattern lcgdict dict=MyNewPackage selectionfile=selection.xml headerfiles="../MyNewPackage/MyNewPackageDict.h"   ---> the list of header files for which we want the LCG dictionaries
     18 end_private
produces (via genreflex) the LCG dictionaries for all classes listed in selection.xml, so these objects become known to ROOT. Then, to access the WBosonBuilder object in ROOT, one simply needs to load the MyNewPackage.so and MyNewPackageDict.so shared objects.
Now, let's redo it with ROOT. But first, let's modify the test.py file to load the AOD:
mv test.py loadAOD.py

Edit loadAOD.py and change AOD.pool.root to rfio:cchpssatlas.in2p3.fr://hpss/in2p3.fr/home/g/ghodban/CCIN2P3-27052008/AOD/mc13.005200.10TeV.atlfast.AOD._00200.v13004002.pool.root

Now let's run on the AOD with the WBosonBuilder:


root -l
// Load first the AOD and extract the transient CollectionTree
TPython::ExecScript("loadAOD.py");

//  Get the transient TTree
TTree *t = (TTree*)gDirectory->Get("CollectionTree_trans");

// Load the shared object and its dictionnary to get access to WBosonBuilder
// don't worry about paths since the LD_LIBRARY_PATH includes the $HOME/athena/13.0.40/InstallArea/lib (check it!)

gSystem->Load("libMyNewPackage.so");
gSystem->Load("libMyNewPackageDict.so");

// Create a WBosonBuilder object!
WBosonBuilder *wBuilder = new WBosonBuilder();

// let's get the TBranch
TBranch *bPJC =  (TBranch*)t->GetBranch("AtlfastParticleJetContainer");

// create a ParticleJetContainer variable
ParticleJetContainer *PJC = new ParticleJetContainer;

// attach the ParticleJetContainer to it
bPJC->SetAddress(&PJC);

// get the first event
bPJC->GetEntry(0);

// how many jets do we have in this first event
std::cout << PJC->size() << std::endl;

// use the WBosonBuilder to  select all jets which have a pT>30GeV and |eta|<3.0
ParticleJetContainer* sPJC = wBuilder->doPreselection(PJC,30000,3.0);

// print the number of selected jets!
std::cout << sPJC->size() << std::endl;

// let's now use the WBosonBuilder::build method to combine the two closest jets!
ParticleJetContainer* WC = wBuilder->build(sPJC);

// print the reconstructed invariant mass!
ParticleJet *W = (*WC)[0];
std::cout << W->m() << std::endl;

Now you know how, with ROOT, to re-use the same software!
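The mass printed by W->m() is just the invariant mass of the summed four-vectors of the two jets. As a sanity check, here is the formula in plain Python (an illustrative helper, not part of the package; units in MeV as in ATHENA):

```python
import math

def inv_mass(p1, p2):
    """Invariant mass of the sum of two four-vectors given as (E, px, py, pz)."""
    e, px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))
```

Two back-to-back massless 40 GeV jets, for instance, give an invariant mass of 80 GeV.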

Exercise: starting from what we did above, write a small ROOT macro which loops over all entries and displays the reconstructed invariant di-jet mass!

Solution: you can find the solution at this location:


cd $HOME/athena/13.0.40/MyNewPackage/*/run/

# copy the macro
cp  /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/diJetMass.C .

# copy the python script for loading the AOD
cp  /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/loadAOD.py .

Then, you can run this macro:

root -l

gROOT->LoadMacro("diJetMass.C")

// call the diJetMass macro
diJetMass()

Exercise: Analysing FDR AODs with AthenaROOTAccess

We have shown how to run on an AOD or a DPD with AthenaROOTAccess. In this section, we show how to run on the FDR08 stream Muon and produce the di-muon invariant mass spectrum.

Exercise: from the fdr08_run1.0003070.StreamMuon.merge.AOD.o1_r12_t1._0001.1:

  • write a muon filter which requires the event to have at least two StacoMuonCollection muons with pT > 10 GeV and |eta| < 3.

  • write a small job option file which creates a DPD with the previous filter!

Solution: you can find the two scripts in this directory:

# the muon filter
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/muon_filter.py .

cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/fdr_AODtoDPD.py .

Now that we have our DPD, let's analyse it with ROOT and find events with two muons of opposite charge only.

Exercise: write a ROOT macro which, given the transient TTree produced with the AthenaROOTAccess test.py script,

* loops over all entries

* keeps events with only two muons of opposite charge

* plots the invariant mass of these two muons

Solution: copy the macro from this directory:

cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/Z0toMuMuAnalysis.C .
cp /afs/in2p3.fr/home/g/ghodban/public/CCIN2P3-27052008/loadFDR.py .

Analyse the DPD (or the AOD!):

root -l

// load the Macro
gROOT->LoadMacro("Z0toMuMuAnalysis.C");

// get the TTree
TTree *t = LoadTree();

// produce the di-muon invariant mass
Z0toMuMuAnalysis(t);
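The event selection inside this analysis boils down to a simple multiplicity and charge requirement followed by an invariant-mass computation. A plain-Python sketch of that logic (the dict-based muons and the function name are illustrative, not the macro's actual API):

```python
import math

def dimuon_mass(muons):
    """Return the invariant mass of the pair if the event has exactly
    two muons of opposite charge, otherwise None. Units in MeV."""
    if len(muons) != 2:
        return None
    m1, m2 = muons
    if m1["charge"] * m2["charge"] >= 0:
        return None  # same-sign pair: reject
    e  = m1["E"]  + m2["E"]
    px = m1["px"] + m2["px"]
    py = m1["py"] + m2["py"]
    pz = m1["pz"] + m2["pz"]
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))
```

Filling a histogram with this quantity over all selected events gives the Z0 peak.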

Tip: at this level, you can write a class like WBosonBuilder, call it in ROOT, and compare timing!

Conclusions

With all these tips, you should now be able to run on the coming data!

A few words on what was not covered in this tutorial:

  • GRID tools: the explanations given on the ATLAS twiki are self-explanatory enough to get you running.

  • More sophisticated tools for Top Physics: You can have a look at what is being done in the ARATopQuarkAnalysis package. We are currently working on an interface with the analysis package developed at LPSC (Grenoble).

FAQ












-- ghodban@IN2P3.FR - 26 May 2008

 