Install DI Operations Mart
Installation of the DI Operations Mart depends on the following conditions and prerequisites:
- The PostgreSQL database is the only supported database for the DI Operations Mart.
- The DI Operations Mart is installed by default when you use the installation wizard.
- If you have an existing 7.1 installation (or upgraded to a 7.1 installation) of the PDI client (Spoon) and the Pentaho Server and do not have the DI Operations Mart installed, install it by running the pentaho_mart_postgresql.sql script located in the /pentaho-server/data/postgresql folder.
- If you have an existing 7.0 installation of the PDI client (Spoon) and the Pentaho Server, you must configure your system to use the DI Operations Mart. Perform the following steps to install the DI Operations Mart on a system that has a 7.0 installation of the Pentaho Server:
- Run the Setup Script
- Set the Global Kettle Logging Variables
- Add JNDI Connections
- Populate the Date and Time Dimensions in the Operations Mart
- Populate the Logging Dimensions in the DI Operations Mart
Step 1: Run the Setup Script
Run the pentaho-server/data/postgresql/pentaho_logging_PostgreSQL.sql script to create the pentaho_dilogs schema and logging tables in the Hibernate database.
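For example, the script could be run with psql; a minimal sketch, assuming a local PostgreSQL instance on the default port and the hibuser account for the Hibernate database (the host, port, and user shown are assumptions; adjust for your environment):

```
# Creates the pentaho_dilogs schema and its logging tables in the hibernate database.
# Host, port, and user are illustrative; substitute your environment's values.
psql -h localhost -p 5432 -U hibuser -d hibernate \
  -f pentaho-server/data/postgresql/pentaho_logging_PostgreSQL.sql
```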
Step 2: Set the Global Kettle Logging Variables
Perform this step on the computer where you have installed your PostgreSQL database. This may or may not be the same computer where you have the Pentaho Server installed.
In the PDI client, choose Edit > Edit the kettle.properties file, then add or edit the following variables and values to match the table below:
Variable | Value |
---|---|
KETTLE_CHANNEL_LOG_DB | live_logging_info |
KETTLE_CHANNEL_LOG_TABLE | channel_logs |
KETTLE_CHANNEL_LOG_SCHEMA | pentaho_dilogs |
KETTLE_JOBENTRY_LOG_DB | live_logging_info |
KETTLE_JOBENTRY_LOG_TABLE | jobentry_logs |
KETTLE_JOBENTRY_LOG_SCHEMA | pentaho_dilogs |
KETTLE_JOB_LOG_DB | live_logging_info |
KETTLE_JOB_LOG_TABLE | job_logs |
KETTLE_JOB_LOG_SCHEMA | pentaho_dilogs |
KETTLE_METRICS_LOG_DB | live_logging_info |
KETTLE_METRICS_LOG_TABLE | metrics_logs |
KETTLE_METRICS_LOG_SCHEMA | pentaho_dilogs |
KETTLE_STEP_LOG_DB | live_logging_info |
KETTLE_STEP_LOG_TABLE | step_logs |
KETTLE_STEP_LOG_SCHEMA | pentaho_dilogs |
KETTLE_TRANS_LOG_DB | live_logging_info |
KETTLE_TRANS_LOG_TABLE | trans_logs |
KETTLE_TRANS_LOG_SCHEMA | pentaho_dilogs |
KETTLE_TRANS_PERFORMANCE_LOG_DB | live_logging_info |
KETTLE_TRANS_PERFORMANCE_LOG_TABLE | transperf_logs |
KETTLE_TRANS_PERFORMANCE_LOG_SCHEMA | pentaho_dilogs |
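For reference, these settings appear in kettle.properties as plain key=value pairs. A sketch of the channel and transformation entries is shown below; the remaining variables from the table follow the same pattern:

```
# Sketch of kettle.properties entries; the other KETTLE_* variables in the
# table above are set the same way.
KETTLE_CHANNEL_LOG_DB=live_logging_info
KETTLE_CHANNEL_LOG_TABLE=channel_logs
KETTLE_CHANNEL_LOG_SCHEMA=pentaho_dilogs
KETTLE_TRANS_LOG_DB=live_logging_info
KETTLE_TRANS_LOG_TABLE=trans_logs
KETTLE_TRANS_LOG_SCHEMA=pentaho_dilogs
```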
Step 3: Add JNDI Connections
This section explains how to add a JNDI connection for the PDI client. Navigate to the pentaho/design-tools/data-integration/simple-jndi folder and open the jdbc.properties file with a text editor. Edit the file to reflect the values in the following table:
Property | Value |
---|---|
pentaho_operations_mart/type | javax.sql.DataSource |
pentaho_operations_mart/driver | org.postgresql.Driver |
pentaho_operations_mart/url | |
pentaho_operations_mart/user | hibuser |
pentaho_operations_mart/password | password |
live_logging_info/type | javax.sql.DataSource |
live_logging_info/driver | org.postgresql.Driver |
live_logging_info/url | |
live_logging_info/user | hibuser |
live_logging_info/password | password |
The URL, user, and password values must be obtained from your system administrator.
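As a sketch, the edited entries might look like the following, with URLs that mirror the context.xml example in the next section (the URLs shown are illustrative; use the values provided by your system administrator):

```
# Illustrative jdbc.properties entries. The URLs are placeholders that mirror
# the context.xml example below; use your administrator's values.
pentaho_operations_mart/type=javax.sql.DataSource
pentaho_operations_mart/driver=org.postgresql.Driver
pentaho_operations_mart/url=jdbc:postgresql://localhost:5432/hibernate
pentaho_operations_mart/user=hibuser
pentaho_operations_mart/password=password
live_logging_info/type=javax.sql.DataSource
live_logging_info/driver=org.postgresql.Driver
live_logging_info/url=jdbc:postgresql://localhost:5432/hibernate?searchpath=pentaho_dilogs
live_logging_info/user=hibuser
live_logging_info/password=password
```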
This section explains how to add a JNDI connection for the Pentaho Server. Perform this task on the computer where you have installed the Pentaho Server.
Navigate to the pentaho/server/pentaho-server/tomcat/webapps/pentaho/META-INF/ folder and open the context.xml file with a text editor. Edit the file to reflect the values in the following example:
```xml
<Resource name="jdbc/pentaho_operations_mart" auth="Container" type="javax.sql.DataSource"
    factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxTotal="20" maxIdle="5"
    maxWaitMillis="10000" username="hibuser" password="password"
    driverClassName="org.postgresql.Driver"
    url="jdbc:postgresql://localhost:5432/hibernate"
    validationQuery="select 1"/>

<Resource name="jdbc/live_logging_info" auth="Container" type="javax.sql.DataSource"
    factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxTotal="20" maxIdle="5"
    maxWaitMillis="10000" username="hibuser" password="password"
    driverClassName="org.postgresql.Driver"
    url="jdbc:postgresql://localhost:5432/hibernate?searchpath=pentaho_dilogs"
    validationQuery="select 1"/>
```
The URL, username, and password values must be obtained from your system administrator.
Step 4: Populate the Date and Time Dimensions in the Operations Mart
On the computer where you have installed your PostgreSQL database, you must populate the Date and Time dimensions in the Operations Mart. Run the Fill in DIM Date and DIM Time job by performing the following steps:
- From the Pentaho User Console, select View > Show Hidden Files.
- Select the Browse Files > Public > Pentaho Operations Mart > DI Ops Mart ETL > Fill_in_DIM_DATE_and_DIM_TIME job file.
Step 5: Populate the Logging Dimensions in the DI Operations Mart
Run the Update_Dimensions_then_Logging_Datamart job on the computer where you have installed your PostgreSQL database to finish setting up the DI Operations Mart:
- From the Pentaho User Console, select View > Show Hidden Files.
- Select the Browse Files > Public > Pentaho Operations Mart > DI Ops Mart ETL > Update_Dimensions_then_Logging_Datamart job file.
The DI Operations Mart is now set up and ready to populate with data. The DI Operations Mart reports can be found in the /Public/Pentaho Operations Mart/DI Audit Reports folder. The Update_Dimensions_then_Logging_Datamart job should be run prior to running reports so that the Operations Mart has the latest activity data. You may also want to schedule this job to run so that the data mart is periodically updated.
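If you want to script or schedule this refresh outside the Pentaho User Console, PDI's Kitchen command-line tool can execute a job stored in the repository. A minimal sketch, assuming a repository connection named pentaho-server defined in your repositories.xml and placeholder admin credentials (the connection name and credentials are assumptions; substitute your own):

```
# Runs the logging datamart update job from the server repository.
# Repository name and credentials below are illustrative placeholders.
./kitchen.sh -rep=pentaho-server -user=admin -pass=password \
  -dir="/public/Pentaho Operations Mart/DI Ops Mart ETL" \
  -job="Update_Dimensions_then_Logging_Datamart"
```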