
Adding an additional hard disk in VirtualBox

Posted by Hemanta Banerjee on October 30, 2010

I came across this great appliance (http://www.oracle.com/technetwork/database/enterprise-edition/databaseappdev-vm-161299.html) from Oracle with the basic Oracle software, such as the database and other developer tools, already pre-installed. After importing the appliance I found that, as usual, it did not have enough disk space for me to play around with. So the challenge was how to expand the disk space so that I could install the Oracle middleware products, such as WebLogic Server and Fusion Middleware, and test out some of the new features in Oracle Fusion Middleware 11g.

For those who do not know about VirtualBox, I would definitely recommend that you check out www.virtualbox.org. VirtualBox is an x86 and AMD64/Intel64 virtualization product very similar to VMware. It is very feature rich and is the only such solution that is freely available as open source software. It was originally developed by Sun and is now maintained by Oracle.

The virtual appliance provided by Oracle had Oracle Enterprise Linux as the guest OS. Being a Linux newbie, searching Google for steps to expand the existing disk was frustrating to say the least. So I ended up adding a new hard disk that I plan to use for my new installations.

First, add a new hard disk. Go to the Virtual Media Manager as shown below and add a new disk.


In my case I added the disk with expandable storage so that it does not take up space on my host machine up front. This will have some performance impact when I start adding data, but since I am using this for testing purposes only, that should be fine.


Now we need to add this disk to the virtual machine. Click on Settings, navigate to the Storage section, and add the new hard disk as shown below. It will be added as a secondary slave.


After booting your guest you need to do some Linux magic for the OS to recognize the hard disk. The first step is to determine the device name of the newly created disk. Run the command fdisk -l as root and you should get output that looks something like this:

[root@localhost ~]# fdisk -l

Disk /dev/hda: 12.8 GB, 12884901888 bytes
255 heads, 63 sectors/track, 1566 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes

Device Boot      Start         End      Blocks   Id  System
/dev/hda1   *           1        1350    10843843+  83  Linux
/dev/hda2            1351        1566     1735020   82  Linux swap / Solaris

Disk /dev/hdb: 12.8 GB, 12884901888 bytes
255 heads, 63 sectors/track, 1566 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes

Device Boot      Start         End      Blocks   Id  System
/dev/hdb1   *           1        1566    12578863+  83  Linux

Disk /dev/hdd: 32.2 GB, 32212254720 bytes
16 heads, 63 sectors/track, 62415 cylinders
Units = cylinders of 1008 * 512 = 516096 bytes

Disk /dev/hdd doesn’t contain a valid partition table

/dev/hdd is the newly added disk. I am sure there is a better way to find this information, but this worked for me. I used fdisk to create a single partition spanning the entire disk, and then formatted the new partition using the mkfs command as shown below.

mkfs -t ext3 /dev/hdd1

The disk is now ready to be used. In my case I want to mount the new disk as /apps, where I am going to install all the applications. So I need to create a new /apps folder and add an entry to /etc/fstab.
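In my case that meant running mkdir /apps and then appending a line like the following to /etc/fstab (assuming the new partition is /dev/hdd1 – match it to whatever device you formatted); after that, running mount /apps mounts the disk without a reboot:

```
# device      mount point   type   options    dump  pass
/dev/hdd1     /apps         ext3   defaults   1     2
```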


Now reboot and your disk is ready for use.

Posted in Linux, OEL, VirtualBox

How to setup report scheduling and report distribution for BusinessObjects

Posted by Hemanta Banerjee on October 29, 2010

I came across a posting on the BOBJ board asking how to set up report distribution where the reports run only if the ETL process is successful. This can of course be done using event-based scheduling in BusinessObjects. So here you go: in this post I will talk about scheduling, including event-based scheduling, and about how to set up report distribution via email or to a folder.

Setting up Calendars

Let us first look at the various options for scheduling reports. BusinessObjects allows reports to be scheduled based on time, custom calendars, or events. The simplest option is to use the built-in schedules provided out of the box.


These cover most of the common scenarios such as “1st day of the month” etc. For example the screenshot below shows that the report will run on the 3rd Monday of every month.


If none of these meet your requirements you can also set up custom calendars. For example, say I want to run a report on the 3rd day of every quarter, except Q4 where I want to run it on the 9th day of the quarter. This cannot be met by any of the standard calendars, so I will define a custom calendar.


After setting the name I can go ahead and select the days when I want to run the report. I can either select specific days as shown below


I can also choose by day of the month. For example below the report will run on the 3rd and 10th day of every month.


Similarly, I can schedule by day of the week. For example, below the report will run on the first Thursday and the third Thursday and Friday of every month.


While we can do a lot of fancy things, I would recommend keeping it simple. After defining the calendar we can go to any report and schedule it using the custom calendar we have defined.


Setting up Events

We can also trigger report schedules on events. There are three kinds of events.


Custom events: These have to be raised by calling the BusinessObjects SDK and are used to integrate reporting with other applications. Any application or web service can use the BOE SDK to raise an event that triggers generation of a report.

File events: BOE will look for a specific file. For example, below BOE will wait for a file called etl_complete.txt in d:\temp and will trigger the event when this file is found.


Scheduled events: These are usually used to define event chains. For example, I want to run report 2 only if report 1 was successful.



Now let us set up a scenario: I want to run the P&L report only if the ETL is complete, and if the P&L report runs successfully then I will run the Balance Sheet report as well. Right-click on the P&L report and select Schedule.


Since I want to test this I am scheduling it to "Now". I am also setting the number of retries to 20 and the retry interval to 60 seconds. This essentially means that the server will wait 20 minutes for the ETL Complete event to occur.


I also set up the events as part of the scheduling setup. I want to wait for the ETL Complete event, which will be triggered by the ETL application placing a file in the d:\temp folder. If the report generation is successful, I will trigger the P&L Complete event, which can be used to trigger the Balance Sheet report.


The last few steps are setting the format and the destination. I have chosen PDF, and the report will be delivered to the default Enterprise location for the user. Checking the status of the job, I can see that it is waiting for the ETL Complete event.


As soon as the ETL process places the file BOE automatically picks it up and kicks off the job.


And successfully completes the job.


Event Chaining

As discussed earlier I can also chain events. For example I have scheduled the Balance Sheet Report to run when the P&L Complete event is triggered.


The P&L report is waiting for the ETL to complete and the Balance Sheet report is now waiting for the P&L report to be complete. This is what I mean by event chaining.



As soon as the ETL process places the trigger file it first kicks off the P&L report and if that is successful will trigger the balance sheet report.


If both are successful they will be delivered based on the settings for the job, which in this example is the user's inbox in InfoView.


In a separate post I will talk more about how to configure other destinations such as email and FTP servers.

Posted in Administration, BusinessObjects, Custom Calendar, Distribution, Event

How to configure IIS Tomcat connector (ISAPI connector) for BusinessObjects

Posted by Hemanta Banerjee on October 28, 2010

One of the frequent questions I run into on the BOBJ board is about setting up IIS with the Tomcat connector for BusinessObjects. I believe most of these posts are from customers using BusinessObjects Edge, which does not support IIS as the web server. To make this work I used the Jakarta connector (AJP13) from Apache, which allows IIS to forward specific requests to Tomcat. This is especially useful if you want to enable Windows Integrated Authentication on your IIS server and set up SSO with trusted authentication, which allows the user to log on to InfoView without entering any user ID or password.

Setting up ISAPI connector with IIS

The AJP13 Jakarta connector (version 1.2.14) can be found on the Apache Software Foundation's site at http://archive.apache.org/dist/tomcat/tomcat-connectors/jk/binaries/win32/jk-1.2.14/. Download isapi_redirect-1.2.14.exe.


This is a Windows installer, and after installation it will have set up a couple of things. By default the software is installed in "C:\Program Files\Apache Software Foundation\Jakarta Isapi Redirector".


Installing and Configuring the Jakarta Connector on IIS

Step 1: Configure the ISAPI connector configuration files. There are two main files to edit.

workers.properties.minimal – This file provides the configuration properties needed to connect to Tomcat. Find the "worker.ajp13w.host" entry and change its value from localhost to <machine_name>.
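For reference, the relevant entries in workers.properties.minimal would end up looking something like this (boetomcat01 is a placeholder for your Tomcat machine name):

```
worker.list=ajp13w
worker.ajp13w.type=ajp13
worker.ajp13w.host=boetomcat01
worker.ajp13w.port=8009
```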


uriworkermap.properties – This file contains all the URL mappings that will be used by the ISAPI connector. There are some defaults. In the next step we will add the BusinessObjects-related URLs here.


I needed to add the BusinessObjects application URLs to this file.
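As an illustration, the entries for BusinessObjects XI 3.1 typically look like the following – treat the exact context roots as an assumption and match them to the web applications actually deployed on your Tomcat:

```
/InfoViewApp/*=ajp13w
/InfoViewAppActions/*=ajp13w
/CmcApp/*=ajp13w
/CmcAppActions/*=ajp13w
```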



Step 2: Set up the virtual directory in IIS. The installer also defines the ISAPI redirector as a virtual directory in IIS as shown below. Check that execute permissions for this virtual directory are set to "Scripts and Executables".


You also have to define the web service extension. Right-click on "Web Service Extensions" and define a new web service extension called "Jakarta mod_jk", with the required file set to isapi_redirect.dll, which can be found in the bin folder of the Jakarta ISAPI redirector.


Now restart IIS using IISRESET.exe on the command line.

Installing and Configuring the Jakarta Connector on Tomcat

Now that you have the IIS ISAPI filter installed and configured on the IIS side, you have to configure Tomcat to accept connections from IIS. To do this, we will configure Tomcat’s AJP13 listener.

  1. On the Tomcat system, find the server.xml file located in Tomcat's \conf directory (INSTALLDIR\Tomcat55\conf).
  2. Edit server.xml: search for port="8009", then uncomment and change the connector entry to look as follows

    <Connector enableLookups="false" port="8009" protocol="AJP/1.3" redirectPort="8443" tomcatAuthentication="false"/>

Save and restart Tomcat. If you run netstat -an on the command line you should see the Tomcat connector listening on port 8009.


You can now test the connector by navigating to one of the forwarded URLs, such as the InfoView URL, through the IIS web server. For example: http://<iiswebserver>/InfoViewApp/.

Posted in Administration, IIS, ISAPI

How to configure NTLM for Crystal Reports Server/BusinessObjects Enterprise/Edge

Posted by Hemanta Banerjee on October 28, 2010

To simplify administration, BOE supports user and group accounts created in external directories such as LDAP, Active Directory, and Windows NT. In my previous post I described the process for configuring LDAP authentication. As with LDAP, the administrator needs to perform some basic setup to configure the server with the information needed to connect to the NT domain.

Before I go into the setup needed in CMC, let me walk through some key concepts in an NTLM deployment. For my testing I am using the local Windows users and groups. To make administration simpler, define a group on your NT server that will hold all the BOE users. I am calling it BOE_Users.


With that done, I can now go to my CMC and enable NTLM authentication. The only difference from LDAP is that for NT all I need to specify is the NT domain for user authentication.


For the meaning of the other settings please refer to the details in the LDAP post here. After clicking Update, the users and groups will be added to the repository, and you can then set up access control for them.


For details on how this works, and on how to enable the drop-down allowing the user to select the authentication mode, see the LDAP post. In a subsequent post I will explain how to enable SSO and trusted authentication with NTLM, which will allow the user to log on to InfoView without entering their user name and password.

Posted in Administration, Authentication, BusinessObjects, Installation, LDAP, NTLM

How to configure LDAP for Crystal Reports Server/BusinessObjects Enterprise/Edge

Posted by Hemanta Banerjee on October 21, 2010

To simplify administration, BOE supports user and group accounts created in external directories such as LDAP, Active Directory, and Windows NT. In my previous post I described the process for configuring Windows NT authentication. As with NT, the administrator needs to perform some basic setup to configure the server with the information needed to connect to the LDAP server.

Before I go into the setup needed in CMC, let me walk through some key concepts in an LDAP deployment. In my setup I have installed a free directory server, Apache Directory Server. It is easy to set up and free, so it works well for my testing. For the LDAP client I have used another free tool from the Apache foundation called Apache Directory Studio.


You can easily create new users and groups using Directory Studio, and it works great for testing purposes. Of course, in a real production environment none of this would be necessary, as you would be connecting to your corporate LDAP server. To make administration simpler, define a group in your LDAP server that will hold all the BOE users. I am calling it BOEUsers.


Now we can go to CMC and add this group in the LDAP configuration for BOE. You can add new authentication providers in BOE by clicking on the Authentication link in CMC.


After enabling LDAP, enter the connection details for the LDAP server. In my case I have installed it on localhost and my base DN is dc=example,dc=com. You also need to provide the logon credentials for a user that BOE can use to connect to the LDAP server for authentication. I have used the admin ID, however this is not essential; you can use any ID with read privileges on the base DN.


You also need to provide the LDAP group that will be mapped to BOE; I have used the BOEUsers group defined earlier. There are also a couple of other parameters relating to how the users are mapped. For new deployments choose the options as shown below. If you want to map existing BOE users to LDAP user IDs, choose "Assign each added LDAP alias to an account with the same name" in the alias options. If the group mapped above contains only BOE users, the option selected in the Alias Update section will suffice; otherwise, if there are users in the group who should not have access to BOE, choose the "Create new users when the user logs in" option in the Alias Update options.


Now click on Update. If you selected the options above, you will notice that the users from the LDAP group have now been imported as BOE users.


And the user has been mapped to the corresponding LDAP alias.


Now you can add this user to any BOE group for access-control assignment and other security settings. The user can log on by selecting LDAP as the authentication mode in InfoView.


Enable Selection of Authentication Mode for Infoview and CMC

By default the authentication drop-down is not displayed in InfoView, but you can enable it with a few settings in the web.xml file for your InfoView application. The files are <INSTALLDIR>\Tomcat55\webapps\InfoViewApp\WEB-INF\web.xml and <INSTALLDIR>\Tomcat55\webapps\CmcApp\WEB-INF\web.xml.

To prompt users for the authentication type on the logon screen, locate the authentication.visible parameter and change its <param-value> from false to true. You will need to restart the Tomcat application server after this change.

<!-- You can specify the default Authentication types here -->
<!-- secEnterprise, secLDAP, secWinAD, secSAPR3 -->
<!-- Choose whether to let the user change the authentication type -->
<!-- If it isn't shown the default authentication type from above will be used -->

Remember to stop Tomcat and clear the Tomcat cache at <INSTALLDIR>\Tomcat55\work\Catalina\localhost. Restart the application server and you should see the drop-down for authentication providers on your InfoView logon page.

Now the user can log on to the system with their directory password.


How does LDAP Integration work

The diagram below summarizes how LDAP authentication works between BusinessObjects and the LDAP server. When BOE is integrated with LDAP, users and passwords are stored in LDAP and no longer defined in the BusinessObjects repository. The BusinessObjects clients authenticate against LDAP at runtime. The LDAP users inherit security from repository groups, which are mapped to LDAP groups using repository group mapping.

LDAP attribute (could be a "role" attribute) to repository group mapping


The BusinessObjects LDAP users belong to group(s) that exist in the BusinessObjects repository. Access rights are attached to these repository profiles and to their parent groups. Authorization happens in two phases: (1) at login, the system retrieves the list of security profiles associated with the user by querying the LDAP corporate directory; (2) the system then computes the user's access rights by combining the access rights associated with the user's security profiles in the repository.


Posted in Administration, Authentication, BusinessObjects, Installation, LDAP

How to perform silent installation of BusinessObjects Enterprise/Edge

Posted by Hemanta Banerjee on October 21, 2010

A silent installation is an installation that you run from the command line to install BOE/Edge on any machine. It is very useful in federated scenarios where you want to ensure that the settings between the remote site and the original are consistent. For more information on federation please see my earlier blog post here.

The setup.exe for BOE can take the following parameters on command line to facilitate silent installation


To perform a silent installation you first need to create the .ini response file. The steps for creating it are:

  1. Open a command-line console and go to the directory that contains the BusinessObjects Enterprise setup.exe.
  2. On the command line, specify the file path and name of the .ini file that will store the installation settings.
  3. Press Enter to launch the installation setup program. Follow the onscreen instructions to enter your preferred installation settings until you reach the final screen of the setup program. These settings are recorded in the response file.
  4. Click Cancel to abort the installation setup. All the parameters – user-defined and default – from the installation setup are written to the .ini file, which is stored in the directory specified in step 2.

Now you can create a batch routine that uses the .ini file created above for all future installations.
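As a sketch, the two invocations might look like this – the -w (write response file) and -r (read response file) switches shown here are the common InstallShield-style flags, so verify the exact switch names against the installation guide for your BOE version before relying on them:

```
rem record the response file (walk through the wizard, then cancel on the last screen)
setup.exe -w C:\temp\response.ini

rem later, run a silent installation using the recorded settings
setup.exe -r C:\temp\response.ini
```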

Posted in BusinessObjects, Installation

How to deploy BusinessObjects in a distributed environment (BOE Federation)

Posted by Hemanta Banerjee on October 21, 2010

Yesterday, while talking to one of my shipping customers, I realized that a traditional centralized BOE deployment would not work for them. They have multiple offices (Belgium, Luxembourg, Shanghai, Singapore). While the offices are all connected, bandwidth is an issue for them and they would like their users to use the WAN as little as possible.

While having separate deployments for each office is always an option, it also means that each BOE server would need to be managed separately – update reports in four places, manage security in four places, and so on. You get the picture.

This is where Federation comes in. It is a great feature of BOE that allows BI administrators to create a process for copying BI content from one system to another while keeping the systems synchronized. The objectives of Federation are to:

  • Simplify the administration of multiple deployments.
  • Enable the distribution of content and/or security principals across geographically separated repositories
  • Avoid heavy use of Wide Area Network (WAN).

Essentially it is a great way to still manage the deployment in one place and replicate the content so that it can be accessed locally by the users. Federation enables administrators to distribute content and implement a consistent rights policy across the organization. One origin site can replicate an object to multiple destinations through multiple jobs. The only restriction is that you cannot replicate replicated content.


There are two main design patterns that can be used for such a deployment:

(1) Centralized design, distributed usage – In this scenario, report creators and designers create reports and universes at the origin site. The destination sites then pull the content from the origin site, and reports are run at the destination sites. Scheduling is performed at the remote sites, and instances stay at the remote sites where the local databases reside. Localized instances can also be sent back to the origin site.

(2) Central schedule, distributed access – In this case, all BI content is managed from a single origin site. This is where the data is and where the scheduling occurs. All pending jobs are sent to the origin site to run. The instances from the completed jobs are then sent back to the branch sites to be viewed.

Steps to create replication jobs.

Step 1: Create a replication list on the origin site – A replication list is a list of objects to be replicated by a replication job. It contains only pointers to the objects (reports, users, and universes) that have been selected for replication.



Select the objects to be replicated. If you want to replicate the entire repository then just select “Replicate All repository objects” and click save.

Step 2: Create a remote connection on the destination site – Remote connection objects contain the information necessary to connect to the remote BOE server. The remote connection is created on the destination site: the remote BOE deployment is always considered the origin site, and the BOE deployment where you create the remote connection object is always considered the destination site.


Step 3: Create a replication job on the destination site – A replication job is used to replicate content between two BOE servers. A replication job must have an associated remote connection and replication list.


Step 4: Schedule the Replication Job to run on the destination site

Couple of points to note:

1. Local data source connections have to be defined on the remote site. If the BOE server cannot find the definition of the local connections it will try to run the report against the origin site.

2. Do not try to use this for moving reports from development to production. There is a separate LCM utility for that; details will come in a separate post.

3. Make sure that proper conflict resolution is set up to handle scenarios where reports can be modified both on the origin and destination site. Ideally the origin server should take precedence and will overwrite the changes at the destination.

More details on this can be found on the SDN website http://wiki.sdn.sap.com/wiki/display/BOBJ/Federation+in+BOE+XI+3.1


Posted in Administration, BusinessObjects, Federation

Auditing II: How to import the auditing reports in BOE

Posted by Hemanta Banerjee on October 18, 2010

In my last post here I explained how to enable auditing in BOE. This post focuses on using the provided universe and reports, which give a great jump start to analysing the audit database.

Thankfully SAP provides a universe and some sample reports. While they are not exactly 100% of what you need, they are a great start. The universe is bundled with the BOE samples and can be found in "Program Files\Business Objects\BusinessObjects Enterprise 12.0\Samples\universe".


Open the Activity.unv file in Universe Designer. You will have to point the database connection at your auditing database; this can be done by setting the connection information in the universe's properties.


Export the universe to the BOE repository. I have created a separate folder called "Audit" for the universe as well as any reports that I subsequently create against the audit database.



The key classes (folders) in the universe are:

Activity – Captures all the activity details, including name, duration, and the object accessed (universe/document)
Job Time Frame – Used for scheduled jobs
Session Analysis – Used to analyze logon sessions: duration, client IP, etc.
Server Information – Used to analyze the BOE services, such as the job server and RAS, used to respond to end-user queries


Some reports are also bundled with the samples (D:\Program Files\Business Objects\BusinessObjects Enterprise 12.0\Samples); these can be imported using the BIAR tool. Unfortunately the reports have not been updated, and you are probably better off developing your own (unless you are just demoing). Some key reports that you might want to develop:

Number of User Sessions – This metric will allow you to trend the adoption of the system based on number of user sessions by week, month, quarter.

Peak Usage – Analyze the number of users accessing the system by hour to determine peak usage patterns.

Average Refresh Time – Will allow you to report to the stakeholders the average refresh time for each report and also the system wide refresh time. Will keep the whiners away.

Top 5/10 Longest Average Refresh Time by Document – Will help pinpoint reports that probably need some work

Top 10 Longest Refresh Instances By User – This report helps you proactively identify which users are experiencing bad performance and try to address their requirements.

Least/Most Accessed Documents

Posted in Administration, Audit, BusinessObjects

Auditing I: How to enable audit logging in BOE XI 3.1

Posted by Hemanta Banerjee on October 18, 2010

One of the great new features in XI 3.1 is the improved auditing capability. The following new capabilities have been included:

  • Provides client auditing for all two- and three-tier clients.
  • Captures IP address and machine name, if the name can be resolved.
  • Tracks which service triggered an event, rather than the server.
  • Audits the folder path so that, even if you have multiple reports with the same name, you know to which report the event refers.
  • Audits the parent-child relationships and parent CUID, which enables you to build reports that aggregate all operations to the level of a common parent.

What is the benefit of auditing? It gives the BI administrator visibility into the BOE environment, enabling them to understand how the system is being used and to make improvements in areas that need additional focus, such as unused reports or long-running reports.

The first step is to turn on auditing and set the location of the audit database. The DSN for the audit database can be set in the CCM.


You can verify the configuration in the “Configuration” tab for the SIA.


With the auditing database set, now you can logon to CMC and enable auditing for logon events by enabling auditing in the CMC properties.


Similarly, you can turn on auditing for other applications such as Desktop Intelligence and WebI, as shown below.


Note: You need to restart the services after you enable auditing.

You can view the audit events in the auditing tables in the database. The key tables to watch are:

APPLICATION_TYPE (metadata) – Applications
EVENT_TYPE (metadata) – English descriptions of the event types
DETAIL_TYPE (metadata) – English descriptions of the event details
AUDIT_EVENT (data) – Table that captures all the events
AUDIT_DETAIL (data) – Detail table for audit events

To test whether it is working open up Desktop Intelligence and see how it shows up in the audit trail.

For example, a query like the one below shows all the user-generated events on my system (the select list and the join between AUDIT_EVENT and AUDIT_DETAIL on Event_ID are assumed; verify the column names against your audit database):

select AUDIT_EVENT.Event_ID, AUDIT_EVENT.User_Name
from AUDIT_EVENT, AUDIT_DETAIL, DETAIL_TYPE, dbo.EVENT_TYPE
where AUDIT_EVENT.Event_ID = AUDIT_DETAIL.Event_ID
and AUDIT_DETAIL.Detail_Type_ID = DETAIL_TYPE.Detail_Type_ID
and AUDIT_EVENT.Event_Type_ID = dbo.EVENT_TYPE.Event_Type_ID
and User_Name <> 'System Account'

The CMS acts as the system auditor; each BusinessObjects Enterprise server that you monitor is an auditee. As the auditor, the CMS controls the overall audit process. Each server writes audit records to a log file local to that server. At regular intervals, the CMS communicates with the auditee servers to request copies of records from their local log files. When the CMS receives these records it writes the data from the log files to the central auditing database.

The CMS also controls the synchronization of audit actions that occur on different machines. Each auditee provides a time stamp for the audit actions that it records in its log file. To ensure that the time stamps of actions on different servers are consistent, the CMS periodically broadcasts its system time to the auditees. The auditees then compare this time to their internal clocks. If differences exist, the auditees correct the time stamps that are recorded in their log files for subsequent audit actions.

In a next post I will describe how you can use WEBI to generate reports and perform analysis on this data.

Posted in Administration, Audit, BusinessObjects

Intelligently managing the BOE Server Intelligence Agent

Posted by Hemanta Banerjee on October 14, 2010

The Server Intelligence Agent (SIA) is a major new feature of the server architecture in BusinessObjects Enterprise XI 3.1. The SIA is the bootstrap service that starts the Central Management Server (CMS) and monitors its health.

Before we start, here is some terminology used in BOE that might be useful:

1. Service – A service is a subsystem that provides a function. For example, the Client Auditing Proxy Service provides the client auditing gateway function. A service runs within the memory space of its container (e.g., Adaptive Job Server or Adaptive Processing Server) under the process ID (PID) of the parent container.

2. Server – A server, or server service, is an OS-level process hosting one or more services. For example, the CMS and the Adaptive Processing Server are servers. The Adaptive Processing Server can host the Client Auditing Proxy Service, Publishing Service, Search Service, and so on. A server service runs under a specific OS account and has its own PID.

In most cases, when administering a BOE system we will be working with servers, which are the physical processes. You can view the servers using the Windows process explorer.


Which servers are started depends on the settings in the CCM shown below


You can also manage the servers from the CMC interface.


One important point to note is that not all servers may be needed in your environment. For example, if I am using only Crystal Reports then it is a good idea to disable the other services to improve performance.

A great feature that has been added is the ability to clone servers inside the cluster. This can be useful even in a single-node environment, for example to create multiple job servers to give batch jobs higher priority. Administrators can now create new servers in the CMC using the "Clone" option on the Servers menu.


For all this to work, you need at least the SIA and CMS processes up. One of the key challenges I have run into is how to troubleshoot when the SIA service itself fails to start.

Fortunately BOE provides a way to enable tracing for the SIA. This can be turned on by adding the "-trace" option at the end of the command line. The trace output goes into the default logging folder, C:\Program Files\Business Objects\BusinessObjects Enterprise 12.0\Logging, in the sia*log* and cms*trace*log files.



Since the trace log file is very detailed, I would recommend turning it off during normal production runs for performance and disk-space reasons.

Posted in Administration, BusinessObjects
