This page describes the installation and upgrade procedures for a DIRAC server operating as a slave to the server at KEK for Belle II.


  • The DIRAC project website: http://diracgrid.org/
  • Email Miyake-san: hideki.miyake_DONT_SPAM_HIM_kek.jp


In the remainder of this document, the following formatting convention is used to differentiate terminal commands from file content:

This background colour denotes terminal input

This background colour denotes file content

Server Controls

DIRAC employs the runit mechanism for server controls. The control directories reside in /opt/dirac/pro/control. Specifically, the controls for the Site Directors (the agents that submit pilot jobs) are part of the Workload Management system, so the site director for the UVicTest queue may, for example, be found in:


For example, to restart the agent, do the following; touching stop_agent makes the agent exit at the end of its current cycle, and runit then relaunches it:

touch /opt/dirac/pro/control/WorkloadManagement/SiteDirectorUVicTest/stop_agent

  • Note: It would be nice to clear up why DIRAC does not run as a system service.
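The restart-by-touch step above can be wrapped in a small helper so that the system, agent name, and control root are not retyped each time. This is only a sketch; the /opt/dirac/pro/control layout and the stop_agent convention are taken from the text above, and the helper name is our own:

```shell
# Request a restart of a DIRAC agent: touching stop_agent makes the agent
# exit at the end of its cycle, after which runit relaunches it.
# Arguments: $1 = system (e.g. WorkloadManagement), $2 = agent name,
#            $3 = control root (defaults to /opt/dirac/pro/control).
restart_agent() {
    system="$1"; agent="$2"; root="${3:-/opt/dirac/pro/control}"
    dir="$root/$system/$agent"
    # Refuse to create stray files if the agent is not actually installed.
    [ -d "$dir" ] || { echo "no such agent control directory: $dir" >&2; return 1; }
    touch "$dir/stop_agent"
    echo "requested restart of $system/$agent"
}
```

For example: `restart_agent WorkloadManagement SiteDirectorUVicTest`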

Server Installation

Before starting, ensure that you are running Python 2.6; as of right now DIRAC does not work with later versions of Python (older versions are untested). Follow the instructions for Additional Server Installation. Here is the installation configuration file install.cfg I used:

# This section determines which DIRAC components will be installed and where
  #   These are options for the installation of the DIRAC software
  #  DIRAC release version (this is an example, you should find out the current
  #  production release)
  Release = v6r10p23
  #  To install the Server version of DIRAC (the default is client)
  InstallType = server
  #  LCG python bindings for SEs and LFC. Specify this option only if your installation
  #  uses those services
  LcgVer = 2014-02-04
  #  If this flag is set to yes, each DIRAC update will be installed
  #  in a separate directory, not overriding the previous ones
  UseVersionsDir = yes
  #  The directory of the DIRAC software installation
  TargetPath = /opt/dirac
  #  DIRAC extra packages to be installed (Web is required if you are installing the Portal on
  #  this server).
  #  For each User Community their extra package might be necessary here:
  #   i.e. LHCb, LHCbWeb for LHCb
  ExtraModules = Belle

  #   These are options for the configuration of the previously installed DIRAC software
  #   i.e., to produce the initial dirac.cfg for the server
  #  Give a Name to your User Community, it does not need to be the same name as in EGI,
  #  it can be used to cover more than one VO in the grid sense
  # VirtualOrganization = belle
  #  Site name
  SiteName = bellecs.heprc.uvic.ca
  #  Setup name
  Setup = Belle-KEK
  #  Default name of system instances
  InstanceName = Production
  #  Flag to use the server certificates
  UseServerCertificate = yes
  #  Configuration Server URL (This should point to the URL of at least one valid Configuration
  #  Service in your installation; for the primary server it should not be used)
  ConfigurationServer = dips://dirac.cc.kek.jp:9135/Configuration/Server
  ConfigurationServer += dips://dirac1.cc.kek.jp:9135/Configuration/Server
  #  Configuration Name
  ConfigurationName = Belle

  #   These options define the DIRAC components being installed on "this" DIRAC server.
  #   The simplest option is to install a slave of the Configuration Server and a
  #   SystemAdministrator for remote management.
  #  The following options defined components to be installed
  #  Name of the installation host (default: the current host )
  #  Used to build the URLs the services will publish
  # Host = dirac.cern.ch
  Host = bellecs.heprc.uvic.ca
  #  List of Services to be installed
  Services  = Configuration/Server
  Services += Framework/SystemAdministrator

  # Additional Parameters from Miyake-san
  WebPortal = no
  Project = Belle
  RootPath = /opt/dirac
  UseServerCertificate = yes
  SkipCADownload = True

Get in touch with Miyake-san to ensure that your host certificate is accepted by the DIRAC server at KEK.
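Before running the installer against install.cfg, it can save a failed run to verify that the essential keys are present. This is a hypothetical convenience check, not part of the DIRAC tooling; the key names are taken from the example configuration above:

```shell
# Sanity-check an install.cfg before handing it to the installer.
# The required keys below come from the example configuration in this page;
# extend the list to match your own setup.
check_install_cfg() {
    cfg="$1"
    missing=0
    for key in Release InstallType Setup SiteName ConfigurationServer; do
        # Keys may be indented, as in the example above.
        grep -q "^ *$key *=" "$cfg" || { echo "missing key: $key" >&2; missing=1; }
    done
    return $missing
}
```

For example: `check_install_cfg install.cfg && echo "install.cfg looks complete"`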

Installing Condor Compute Element with X509 Support

Some specialized Condor submission modules are needed to allow DIRAC to pass its X509 credentials to jobs. Add LocalComputingElement.py (skip it if it already exists) and CondorX509ComputingElement.py to


and condorx509ce to


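The destination directories above were lost from this page, so verify them against your DIRAC installation before copying anything. As an illustration only, a copy step that honours the "skip it if it already exists" note might look like this (the helper name and paths are our own):

```shell
# Illustrative only: the real destination directories were lost from this
# page -- check your DIRAC tree before copying.
# Installs an attached module, stripping the .txt suffix the wiki adds,
# and skips files that are already present (per the note above).
install_ce_module() {
    src="$1"; dest_dir="$2"
    name="$(basename "$src" .txt)"   # attachments are saved with a .txt suffix
    if [ -e "$dest_dir/$name" ]; then
        echo "skipping $name: already present" >&2
        return 0
    fi
    cp "$src" "$dest_dir/$name"
}
```

For example: `install_ce_module LocalComputingElement.py.txt /path/to/ComputingElements` (destination path hypothetical).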
Create the site director

With the DIRAC infrastructure in place, install the agent and register it with runit:

cd /opt/dirac/
source bashrc
dirac-install-agent WorkloadManagement SiteDirectorUVicTest -m SiteDirector -c y
ln -s /opt/dirac/runit/WorkloadManagement/SiteDirectorUVicTest /opt/dirac/startup/WorkloadManagement_SiteDirectorUVicTest
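The final ln -s step above fails if the link already exists, which happens when re-running the procedure. A small sketch of an idempotent variant, using ln -sfn to replace any stale link (helper name is our own):

```shell
# Register an installed agent with runit by linking its runit directory
# into the startup directory; -sfn replaces any existing link, so the
# step is safe to re-run.
link_agent_startup() {
    runit_dir="$1"    # e.g. /opt/dirac/runit
    startup_dir="$2"  # e.g. /opt/dirac/startup
    system="$3"; agent="$4"
    ln -sfn "$runit_dir/$system/$agent" "$startup_dir/${system}_${agent}"
}
```

For example: `link_agent_startup /opt/dirac/runit /opt/dirac/startup WorkloadManagement SiteDirectorUVicTest`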

Server Update

Get in touch with Miyake-san to get your DN registered with the dirac_admin group. After that, you should be able to create a proxy with the proper VOMS roles:

gb2_proxy_init -g dirac_admin

With those credentials you can log into the DIRAC admin console and run the update:

dirac-admin-sysadmin-cli
update <version>

-- FrankBerghaus - 2014-06-27

Topic attachments

  Attachment                          Size    Date / Time        Comment
  CondorX509ComputingElement.py.txt   0.8 K   2014-06-27 21:08   DIRAC Job Submission to Local Condor Queue with X509 Credentials
  LocalComputingElement.py.txt        16.1 K  2014-06-27 21:08   DIRAC Job Submission to Local Condor Queue with X509 Credentials
  condorx509ce                        5.6 K   2014-06-27 21:08   DIRAC Job Submission to Local Condor Queue with X509 Credentials
Topic revision: r6 - 2014-07-03 - frank