
Revision 24: 2019-01-28 - rsobie


Submitting Belle2 jobs from KEK and UVIC

Log in to KEK (login and computer systems) or to the belleuser VM on beaver

ssh sshcc1.kek.jp
ssh login.cc.kek.jp

Set up a new release (3.0.0) - requires your KEK ssh passphrase and DESY password

source /sw/belle2/tools/b2setup
b2analysis-create release-03-00-00 release-03-00-00
cd release-03-00-00

Copy the analysis software (.h and .cc files) from the previous release directory (this should be managed with git) and compile

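The copy step above can be sketched as follows. This is a minimal sketch using Python's shutil; the directory names and the TauSteerModule file names are illustrative stand-ins, not the actual KEK paths:

```python
import shutil
import tempfile
from pathlib import Path

def copy_analysis_sources(src_dir: Path, dst_dir: Path) -> list:
    """Copy the .h and .cc analysis sources from a previous release
    directory into the new one; returns the file names copied."""
    copied = []
    for pattern in ("*.h", "*.cc"):
        for f in sorted(src_dir.glob(pattern)):
            shutil.copy2(f, dst_dir / f.name)
            copied.append(f.name)
    return copied

# Demo with temporary stand-in directories (the real source would be
# the previous release directory in your home area):
old = Path(tempfile.mkdtemp())
new = Path(tempfile.mkdtemp())
(old / "TauSteerModule.h").write_text("// header\n")
(old / "TauSteerModule.cc").write_text("// source\n")
copied = copy_analysis_sources(old, new)
```

After copying, compile in the release directory as usual.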

Running interactive jobs (3.0.0)

cd release-03-00-00
source /sw/belle2/tools/b2setup

Running grid jobs (3.0.0)

Initialization of Belle2 environment at KEK
source /sw/belle2/gbasf2/BelleDIRAC/gbasf2/tools/setup
gb2_proxy_init -g belle

Copy the shared libraries to the gbasf2 directory (this should be tested using the full path name)

 cp /home/belle2/rsobie/release-03-00-00/modules/Linux_x86_64/opt/lib* .
(librelease-03-00-00.so librelease-03-00-00.b2modmap)
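Submissions fail in confusing ways if either artifact is missing from the working directory, so a quick pre-submission check can help. A sketch (the file names follow the example above; the check itself is a local convenience, not part of gbasf2):

```python
import tempfile
from pathlib import Path

def missing_artifacts(workdir: str, release: str) -> list:
    """Return the names of the expected shared-library artifacts
    (lib<release>.so and lib<release>.b2modmap) that are absent
    from the gbasf2 working directory."""
    expected = [f"lib{release}.so", f"lib{release}.b2modmap"]
    return [name for name in expected
            if not (Path(workdir) / name).exists()]

# Example: an empty directory is missing both artifacts.
empty = tempfile.mkdtemp()
absent = missing_artifacts(empty, "release-03-00-00")
```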

Add the library path to the Python script

# configure the tau module
#mytau = b2.register_module('tausteer')
mytau = b2.register_module('tausteer', shared_lib_path='./librelease-03-00-00.so')

Submit a tau pair MC analysis job with one file

gbasf2 runTauMC.py -p jan2019 -s release-03-00-00 -f librelease-03-00-00.so librelease-03-00-00.b2modmap -i /belle/MC/release-01-00-03/DB00000294/MC10/prod00004783/s00/e1002/4S/r00000/taupair/mdst/sub00/mdst_000001_prod00004783_task10010000001.root
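The submission line above is long enough that assembling it programmatically helps avoid typos. A sketch in pure Python, mirroring the flags shown above (-p project, -s release, -f sandbox files, -i input LFN); the helper name is ours, not a gbasf2 API:

```python
def build_gbasf2_cmd(steering, project, release, libs, input_lfn):
    """Assemble the gbasf2 argument list: -p project, -s release,
    -f extra input sandbox files, -i input dataset/file LFN."""
    return (["gbasf2", steering, "-p", project, "-s", release, "-f"]
            + list(libs) + ["-i", input_lfn])

cmd = build_gbasf2_cmd(
    "runTauMC.py", "jan2019", "release-03-00-00",
    ["librelease-03-00-00.so", "librelease-03-00-00.b2modmap"],
    "/belle/MC/release-01-00-03/DB00000294/MC10/prod00004783/s00/e1002"
    "/4S/r00000/taupair/mdst/sub00"
    "/mdst_000001_prod00004783_task10010000001.root")
```

The resulting list can be passed to subprocess.run, or joined with spaces for the shell.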

gb2_job_status -p jan2019

gb2_job_output -j <job#>

The output ntuple is written to /group/belle2/users/rsobie/taupair.root

Running interactive jobs (older releases)

source /cvmfs/belle.cern.ch/sl6/tools/setup_belle2
setuprel release-00-07-02
cd working/basf2/
cp /cvmfs/belle.cern.ch/sl6/releases/release-00-07-02/reconstruction/examples/example.py .

Add the following lines into the example.py script:

from ROOT import Belle2
use_local_database(Belle2.FileSystem.findFile("data/framework/database.txt"), "", True, LogLevel.ERROR)

Run the MC generation example job from the command line

basf2 ./example-new.py
(note that the "new" version has extra lines for a database fix)

Run the Analysis example job (my_ana.py) from the command line (using MC7 data).

(belleuser)$ setuprel release-00-07-02  
(belleuser)$ basf2 ./my_ana.py -i ~/mdst_000113_prod00000223_task00000113.root

(kek)$ basf2 ./my_ana.py \
-i /ghi/fs01/belle2/bdata/MC/release-00-07-02/DBxxxxxxxx/MC7/prod00000223/s00/e0000/4S/r00000/signal/sub00/mdst_000113_prod00000223_task00000113.root

Run the Analysis example job using MC8 data (stored in Ceph at CERN) - note the use of s3 rather than http

(belleuser)$ setuprel build-2017-05-06
(belleuser)$ basf2 ./my_ana.py -i s3://rjsBucket.cs3.cern.ch/mdst_000001_prod00001231_task00000001.root

Running grid jobs

Initialization of Belle2 environment

(belleuser)$ source /opt/gbasf2KEK/BelleDIRAC/gbasf2/tools/setup
(kek)$ source /sw/belle2/gbasf2/BelleDIRAC/gbasf2/tools/setup

gb2_proxy_init -g belle

Run example job on the grid

cd working/gbasf
(belleuser)$ gbasf2 example.py -p myproject -s release-00-07-02 

Add this option to run on the UVIC cloud (--site=DIRAC.UVic.ca)

Locate the data samples

gb2_ds_list /belle/MC/release-00-07-02/DBxxxxxxxx/MC7/prod00000223/s00/e0000/4S/r00000/signal/sub00
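The LFN above follows a fixed layout (release, database tag, campaign, production, experiment/energy/run, channel). A small helper can compose it; this is sketched from the single example path on this page, so the layout is inferred, not an official specification:

```python
def mc_dataset_lfn(release, db_tag, campaign, prod,
                   exp, energy, run, channel):
    """Compose an MC dataset LFN in the layout seen above:
    /belle/MC/<release>/<db>/<campaign>/<prod>/s00
        /<exp>/<energy>/<run>/<channel>/sub00"""
    return (f"/belle/MC/{release}/{db_tag}/{campaign}/{prod}"
            f"/s00/{exp}/{energy}/{run}/{channel}/sub00")

# Reproduce the example path (DBxxxxxxxx left as the page shows it):
lfn = mc_dataset_lfn("release-00-07-02", "DBxxxxxxxx", "MC7",
                     "prod00000223", "e0000", "4S", "r00000", "signal")
```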

Run the analysis example job on the grid (over 100 jobs are submitted, one per mdst file)

cd working/gbasf2
gbasf2 ./my_ana.py -p rjs1 -i /belle/MC/release-00-07-02/DBxxxxxxxx/MC7/prod00000223/s00/e0000/4S/r00000/signal/sub00
(same command on belleuser or kek)

Monitoring the jobs

gb2_job_status -p <ProjectName> --site DIRAC.UVic.ca 
gb2_site_summary --site DIRAC.UVic.ca --date 2016-04-10
gb2_job_status --site DIRAC.UVic.ca --status Waiting
gb2_job_status --site DIRAC.UVic.ca --status Running