You can define additional Spark properties in two ways:
- Using the application.properties file
- Using run modification parameters within a transformation
Any property that starts with spark. is passed directly to the Spark engine.
Use the properties file
Within the application.properties file, you may add any number of Spark configuration properties, as detailed at: https://spark.apache.org/docs/latest/configuration.html
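For example, a minimal set of entries in the application.properties file might look like the following (the keys are standard Spark configuration properties; the values shown are illustrative, not recommendations):

```properties
# Each key starts with "spark.", so it is passed directly to the Spark engine
spark.executor.memory=4g
spark.executor.cores=2
spark.sql.shuffle.partitions=200
```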
Use run modification parameters
You may also define any additional Spark configuration properties as run modification parameters within a transformation.
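As a sketch, a run modification parameter is simply a name/value pair whose name is a Spark configuration key (the value here is illustrative):

```
Parameter: spark.driver.memory
Value:     2g
```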
Order of processing
Because Spark properties can be set in multiple ways, it is important to understand the following order of processing:
- Spark properties set on the Pentaho Server apply to all users and all transformations.
- Spark properties specified within a KTR apply to the specific user who runs the transformation. For example, if a user wants to change the memory size, they can embed the appropriate Spark parameter in the KTR so that it executes when they run the transformation.
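To illustrate this order, suppose the server-wide properties file and a user's KTR both set the same property (values illustrative). Because the KTR setting is processed for that specific run, it is the value that applies when that user runs the transformation:

```
application.properties (server-wide):  spark.executor.memory=2g
KTR run modification parameter:        spark.executor.memory=4g
Effective value for that run:          4g
```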