This article explains how to configure the Pentaho Server to connect to a Hortonworks Hadoop 2.5 cluster to use secure impersonation. For an overview of secure impersonation, refer to Setting Up Big Data Security. The following sections will guide you through the setup and configuration process:
- Parameter Configuration
- Configuring MapReduce Jobs (Windows-only)
- Next Steps
The following requirements must be met to use secure impersonation:
- The cluster must be secured with Kerberos, and the Kerberos server used by the cluster must be accessible to the Pentaho Server.
- The Pentaho computer must have Kerberos installed and configured as explained in Set Up Kerberos for Pentaho.
If your system has version 8 of the Java Runtime Environment (JRE) or the Java Developer's Kit (JDK) installed, you will not need to install the Kerberos client, since it is included in the Java installation. You will need to modify the Kerberos configuration file, krb5.conf, as specified in the Set Up Kerberos for Pentaho topic.
- Pentaho shims for client and server must be configured for each component as explained in Edit Secured Cluster Configuration Properties.
Follow the instructions for editing the config.properties file below instead of the instructions in the Edit config.properties (Secured Clusters) section of the Set up Pentaho to Connect to a Hortonworks Cluster article.
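As an illustration of the Kerberos requirement above, a minimal krb5.conf might look like the following sketch. The realm EXAMPLE.COM and the host kdc.example.com are placeholders, not values from this article; substitute the realm and KDC host used by your cluster.

```ini
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
    admin_server = kdc.example.com
  }

[domain_realm]
  .example.com = EXAMPLE.COM
  example.com = EXAMPLE.COM
```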
The mapping types value in the config.properties file turns secure impersonation on or off. The mapping types supported by the Pentaho Server are disabled and simple. When set to disabled or left blank, the Pentaho Server does not use authentication. When set to simple, Pentaho users can connect to the Hadoop cluster as a proxy user.
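For example, the two supported mapping types appear in config.properties as follows (set only one of them):

```ini
# Secure impersonation off; the Pentaho Server does not use authentication:
pentaho.authentication.default.mapping.impersonation.type=disabled

# Secure impersonation on; Pentaho users connect to the cluster as a proxy user:
pentaho.authentication.default.mapping.impersonation.type=simple
```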
To configure the cluster for secure impersonation, stop the Pentaho Server and complete the following steps:
- Navigate to the pentaho-server\pentaho-solutions\system\kettle\plugins\pentaho-big-data-plugin\hadoop-configurations\hdp25 folder and open the config.properties file with a text editor.
- Modify the config.properties file with the values in the following table:
| Parameter | Value |
| --- | --- |
| pentaho.authentication.default.kerberos.principal | Set the Kerberos principal; for example, exampleUser@EXAMPLE.COM. |
| pentaho.authentication.default.kerberos.keytabLocation | Set the Kerberos keytab location. You only need to set the password or the keytab, not both. |
| pentaho.authentication.default.kerberos.password | Set the Kerberos password. You only need to set the password or the keytab, not both. |
| pentaho.authentication.default.mapping.impersonation.type | simple |
| pentaho.authentication.default.mapping.server.credentials.kerberos.principal | Set the Kerberos principal; for example, exampleUser@EXAMPLE.COM. |
| pentaho.authentication.default.mapping.server.credentials.kerberos.keytabLocation | Set the Kerberos keytab location. You only need to set the password or the keytab, not both. |
| pentaho.authentication.default.mapping.server.credentials.kerberos.password | Set the Kerberos password. You only need to set the password or the keytab, not both. |
| pentaho.oozie.proxy.user | Add the proxy user's name if you plan to access the Oozie service through a proxy. Otherwise, leave it set to oozie. |

In this table, exampleUser@EXAMPLE.COM is provided as a sample of how you would specify your proxy user. If your existing config.properties file contains key-value pairs that are not security related, merge those settings into the file.
- Save and close the config.properties file.
- Copy the config.properties file to the following folders:
- Restart the Pentaho Server.
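Putting the table together, the security-related portion of config.properties might look like the following sketch. It assumes keytab authentication with exampleUser@EXAMPLE.COM as a placeholder principal; the keytab path is illustrative, not a value from this article.

```ini
# Kerberos identity the Pentaho Server authenticates as:
pentaho.authentication.default.kerberos.principal=exampleUser@EXAMPLE.COM
pentaho.authentication.default.kerberos.keytabLocation=/home/exampleUser/exampleUser.keytab
# Leave the password blank when using a keytab (set one or the other, not both):
pentaho.authentication.default.kerberos.password=

# Turn on secure impersonation:
pentaho.authentication.default.mapping.impersonation.type=simple

# Credentials the server uses when impersonating Pentaho users:
pentaho.authentication.default.mapping.server.credentials.kerberos.principal=exampleUser@EXAMPLE.COM
pentaho.authentication.default.mapping.server.credentials.kerberos.keytabLocation=/home/exampleUser/exampleUser.keytab
pentaho.authentication.default.mapping.server.credentials.kerberos.password=

# Proxy user for the Oozie service:
pentaho.oozie.proxy.user=oozie
```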
Configuring MapReduce Jobs (Windows-only)
For Windows systems, you must modify the mapred-site.xml files to run MapReduce jobs with secure impersonation. Complete the following steps to modify the files:
- Navigate to the design-tools\data-integration\plugins\pentaho-big-data-plugin\hadoop-configurations\hdp25 folder and open the mapred-site.xml file with a text editor.
- Navigate to the pentaho-server\pentaho-solutions\system\kettle\plugins\pentaho-big-data-plugin\hadoop-configurations\hdp25 folder and open the mapred-site.xml file with a text editor.
- Add the following two properties to the two mapred-site.xml files:
<property>
  <name>mapreduce.app-submission.cross-platform</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
- Save and close the files.
When you have saved your changes to the repository and your Hadoop cluster is connected to the Pentaho Server, you are ready to use secure impersonation to run your transformations and jobs from the Pentaho Server.
Secure impersonation from the PDI client is not currently supported.