
Job entry reference

Job entries extend and expand the functionality of jobs. This section lists the supported job entries.
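
Job entries are normally wired together in Spoon and run with Kitchen, but for orientation, here is a minimal sketch of running a saved job through PDI's Java API. The file path and class name are placeholders, and a production setup would add logging and error handling:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

// Minimal sketch: run a saved job (.kjb) whose entries -- Start, Shell,
// Success, and so on -- were wired together in Spoon.
// The file path below is a placeholder.
public class RunJobSketch {
  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();                           // bootstrap PDI
    JobMeta jobMeta = new JobMeta("/path/to/sample.kjb", null);
    Job job = new Job(null, jobMeta);                   // no repository
    job.start();
    job.waitUntilFinished();
    if (job.getErrors() > 0) {
      System.err.println("Job finished with errors");
    }
  }
}
```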

A-F

| Name | Category | Description |
|---|---|---|
| Abort job | Utility | Abort the job. |
| Add filenames to result | File management | Add filenames to result. |
| Amazon EMR Job Executor | Big Data | Execute MapReduce jobs in Amazon EMR. |
| Amazon Hive Job Executor | Big Data | Execute Hive jobs in Amazon EMR. |
| Bulk load from MySQL into file | Bulk loading | Load from a MySQL table into a file. |
| Bulk load into Amazon Redshift | Bulk loading | Bulk load files located in S3 buckets into an Amazon Redshift database. |
| Bulk load into MSSQL | Bulk loading | Load data from a file into an MSSQL table. |
| Bulk load into MySQL | Bulk loading | Load data from a file into a MySQL table. |
| Bulk load into Snowflake | Bulk loading | Load data from a file into a Snowflake data warehouse. |
| Check Db connections | Conditions | Check if a connection can be made to one or several databases. |
| Check files locked | Conditions | Check if one or several files are locked by another process. |
| Check if a folder is empty | Conditions | Check if a folder is empty. |
| Check if connected to repository | Repository | Return true when connected to a repository. |
| Check if XML file is well formed | XML | Check if one or several XML files are well formed. |
| Check webservice availability | Conditions | Check if a web service is available. |
| Checks if files exist | Conditions | Check if files exist. |
| Columns exist in a table | Conditions | Check if one or several columns exist in a table on a specified connection. |
| Compare folders | File management | Compare two folders (or two files). |
| Convert file between Windows and Unix | File management | Convert file content between Windows and Unix line endings. Converting to Unix replaces CRLF (carriage return and line feed) with LF (line feed). |
| Copy Files | File management | Copy files. |
| Copy or Move result filenames | File management | Copy or move result filenames. Since version 5.0, this job entry has been renamed to Process result filenames and also handles Delete. |
| Create a folder | File management | Create a folder. |
| Create file | File management | Create an empty file. |
| Create Snowflake warehouse | Utility | Create a Snowflake virtual warehouse. |
| Decrypt files with PGP | File encryption | Decrypt files encrypted with PGP (Pretty Good Privacy). This job entry needs GnuPG to work properly. |
| Delete file | File management | Delete a file. |
| Delete filenames from result | File management | Delete filenames from result. |
| Delete files | File management | Delete files. |
| Delete folders | File management | Delete the specified folders. If a folder contains files, PDI deletes them all. |
| Delete Snowflake warehouse | Utility | Drop a Snowflake warehouse. |
| Display Msgbox Info | Utility | Display a simple informational message box. |
| DTD Validator | XML | Verify that an XML file corresponds to a certain structure or format. |
| Dummy | General | Use the Dummy job entry to do nothing in a job. |
| Encrypt files with PGP | File encryption | Encrypt files with PGP (Pretty Good Privacy). This job entry needs GnuPG to work properly. |
| Evaluate files metrics | Conditions | Evaluate file size or file count. |
| Evaluate rows number in a table | Conditions | Evaluate the content of a table. You can also specify an SQL query. |
| Example job (deprecated) | Deprecated | An example test job entry for a plugin. |
| Export repository to XML file | Repository | Export the repository to an XML file. |
| File Compare | File management | Compare two files. |
| File exists (Job Entry) | Conditions | Check if a file exists. |
| FTP Delete | File transfer | Delete files on a remote host. |
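
Several of the Conditions entries above (Check Db connections, Table exists, Columns exist in a table) boil down to probing a JDBC connection. As a rough standalone illustration of the same idea, not PDI's actual implementation, here is the check in plain Java; the JDBC URL and credentials are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Rough standalone equivalent of what "Check Db connections" verifies:
// can a connection be opened at all? URL and credentials are placeholders.
public class CheckDbConnectionSketch {
  public static void main(String[] args) {
    String url = "jdbc:postgresql://localhost:5432/mydb";
    try (Connection c = DriverManager.getConnection(url, "user", "secret")) {
      System.out.println("Connection OK, valid: " + c.isValid(5));
    } catch (SQLException e) {
      System.err.println("Connection failed: " + e.getMessage());
    }
  }
}
```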

G-L

| Name | Category | Description |
|---|---|---|
| Get a file with FTP | File transfer | Get files using FTP (File Transfer Protocol). |
| Get a file with FTPS | File transfer | Get files using FTPS (FTP secure). |
| Get a file with SFTP | File transfer | Get files using SFTP (Secure File Transfer Protocol). |
| Get mails (POP3/IMAP) | Mail | Get mail from a POP3/IMAP server and save it into a local folder. |
| Google BigQuery Loader | Big Data | Load data into Google BigQuery from a Google Cloud Storage account. |
| Hadoop Copy Files | Big Data | Copy files in a Hadoop cluster from one location to another. |
| Hadoop job executor | Big Data | Execute a MapReduce job contained in a JAR file. |
| HL7 MLLP Acknowledge | Utility | Acknowledge HL7 messages. |
| HL7 MLLP Input | Utility | Read data from HL7 data streams within a transformation. |
| HTTP | File management | Get or upload a file using HTTP (Hypertext Transfer Protocol). |
| JavaScript | Scripting | Evaluate the result of the execution of a previous job entry. |
| Job (job entry) | General | Execute a job. |
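
The FTP-family entries above wrap the usual client operations (connect, authenticate, transfer, disconnect). For orientation, here is a bare-bones download sketched with the Apache Commons Net library; host, credentials, and paths are placeholders, and PDI's own implementation may differ:

```java
import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

// Bare-bones FTP download, the core of what "Get a file with FTP" automates.
// Host, credentials, and paths are placeholders.
public class FtpGetSketch {
  public static void main(String[] args) throws Exception {
    FTPClient ftp = new FTPClient();
    ftp.connect("ftp.example.com");
    ftp.login("user", "secret");
    ftp.setFileType(FTP.BINARY_FILE_TYPE);   // avoid line-ending mangling
    try (OutputStream out = new FileOutputStream("/tmp/report.csv")) {
      ftp.retrieveFile("/remote/report.csv", out);
    }
    ftp.logout();
    ftp.disconnect();
  }
}
```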

M-R

| Name | Category | Description |
|---|---|---|
| Mail | Mail | Send an email. |
| Mail validator | Mail | Check the validity of an email address. |
| Modify Snowflake warehouse | Utility | Modify a Snowflake virtual warehouse. |
| Move Files | File management | Move files. |
| MS Access bulk load (deprecated) | Deprecated | Load data into a Microsoft Access table from a CSV file. Replaced by the Microsoft Access Output step. |
| Oozie Job Executor | Big Data | Execute Oozie workflows. |
| Palo cube create (deprecated) | Deprecated | Create a cube on a Palo server. |
| Palo cube delete (deprecated) | Deprecated | Delete a cube on a Palo server. |
| Pentaho MapReduce | Big Data | Execute transformation-based MapReduce jobs in Hadoop. |
| Pig Script Executor | Big Data | Execute a Pig script on a Hadoop cluster. |
| Ping a host | Utility | Ping a host. |
| Process result filenames | File management | Copy, move, or delete result filenames. |
| Put a file with FTP | File transfer | Put files using FTP (File Transfer Protocol). |
| Put a file with SFTP | File transfer | Put files using SFTP (Secure File Transfer Protocol). |
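
The Ping a host entry above is a simple reachability check. The same test in plain Java, as a rough illustration rather than PDI's actual implementation (the host name is a placeholder):

```java
import java.net.InetAddress;

// Plain-Java version of the reachability test behind "Ping a host".
// The host name is a placeholder.
public class PingSketch {
  public static void main(String[] args) throws Exception {
    InetAddress host = InetAddress.getByName("example.com");
    boolean reachable = host.isReachable(3000);  // timeout in milliseconds
    System.out.println(host.getHostAddress() + " reachable: " + reachable);
  }
}
```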

S-Z

| Name | Category | Description |
|---|---|---|
| Send information using Syslog | Utility | Send information to another server using the Syslog protocol. |
| Send Nagios passive check | Utility | Send Nagios passive checks. |
| Send SNMP trap | Utility | Send an SNMP trap to a target host. |
| Set variables | General | Set one or several variables. |
| Shell | Scripting | Execute a shell script. |
| Simple evaluation | Conditions | Evaluate one field or variable. |
| Spark Submit | Big Data | Submit Spark jobs to Hadoop clusters. |
| SQL | Scripting | Execute SQL on a certain database connection. |
| Sqoop Export | Big Data | Export data from the Hadoop Distributed File System (HDFS) into a relational database (RDBMS) using Apache Sqoop. |
| Sqoop Import | Big Data | Import data from a relational database (RDBMS) into the Hadoop Distributed File System (HDFS) using Apache Sqoop. |
| SSH2 Get (deprecated) | Deprecated | Get files using SSH2. Deprecated in 5.0 in favor of the SFTP job entry. |
| SSH2 Put (deprecated) | Deprecated | Put files on a remote host using SSH2. Deprecated in 5.0 in favor of the SFTP job entry. |
| Start | General | Define the starting point for job execution. Every job must have one (and only one) Start entry. |
| Start a PDI Cluster on YARN | Big Data | Start a PDI cluster on YARN. |
| Start Snowflake warehouse | Utility | Resume a Snowflake warehouse. |
| Stop a PDI Cluster on YARN | Big Data | Stop a PDI cluster on YARN. |
| Stop Snowflake warehouse | Utility | Suspend a Snowflake warehouse. |
| Success | General | Clear any error state encountered in a job and force it to a success state. |
| Table exists | Conditions | Check if a table exists on a database connection. |
| Talend job execution (deprecated) | Deprecated | Execute an exported Talend job. |
| Transformation (job entry) | General | Run a transformation. |
| Truncate tables | Utility | Truncate one or several tables. |
| Unzip file | File management | Unzip a file into a target folder. |
| Upload files to FTPS | File transfer | Upload files using FTPS (FTP secure). |
| Verify file signature with PGP | File encryption | Verify a file signature with PGP (Pretty Good Privacy). This job entry needs GnuPG to work properly. |
| Wait for | Conditions | Wait for a delay. |
| Wait for file | File management | Wait for a file. |
| Wait for SQL | Utility | Scan a database and succeed when a specified condition on the returned rows is true. |
| Write to file | File management | Write text content to a file. |
| Write To Log | Utility | Write a message to the log. |
| XSD Validator | XML | Perform an XSD validation against data in a file or in an input field. |
| XSL Transformation | XML | Perform an XSL transformation. |
| Zip file | File management | Zip files from a directory and process the files. |
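
The Transformation entry above runs a .ktr inside a job; the Set variables entry makes values available to it. A minimal sketch of the equivalent via PDI's Java API, with the file path and variable name as placeholders:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

// Sketch of what the "Transformation" job entry does: load and run a .ktr.
// The file path and the TARGET_DIR variable are placeholders.
public class RunTransformationSketch {
  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();
    TransMeta transMeta = new TransMeta("/path/to/sample.ktr");
    Trans trans = new Trans(transMeta);
    trans.setVariable("TARGET_DIR", "/tmp");   // cf. the Set variables entry
    trans.execute(null);                       // no extra command-line args
    trans.waitUntilFinished();
    if (trans.getErrors() > 0) {
      System.err.println("Transformation finished with errors");
    }
  }
}
```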