
Hadoop Copy Files


This job entry copies files in a Hadoop cluster from one location to another.

General

Enter the following information in the job entry field:

  • Job Name: Specify the unique name of the Hadoop Copy Files job entry on the canvas. You can customize the name or leave it as the default.

Options

The Hadoop Copy Files job entry has two tabs of options: Files/Folders and Settings. Each tab is described below.

Files/Folders tab


  • Source Environment: Specify the type of file system that contains the files you want to copy.

  • Source File/Folder: Specify the file or directory you want to copy. Click Browse to navigate to the source file or folder through the VFS browser. See VFS browser for more information.

  • Wildcard (RegExp): Specify the files to copy with regular expressions instead of static file names. For example, .*\.txt selects all files with a .txt extension. See the sample patterns below.

  • Destination Environment: Specify the file system where you want to put your copied files.

  • Destination File/Folder: Specify the file or directory where you want to place your copied files. Click Browse and select Hadoop to enter your Hadoop cluster connection details.

Note: The source and destination environments must be the same.
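
For illustration, here are a few hypothetical Wildcard (RegExp) values and the files they would select. The file names are invented examples, not values from this documentation:

.*\.txt        selects every file with a .txt extension
part-.*\.csv   selects files such as part-0001.csv or part-0002.csv
sales_2021.*   selects every file whose name begins with sales_2021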

Settings tab


  • Include subfolders: Select to copy all subdirectories in the chosen directory.

  • Destination is a file: Select to specify that the destination is a file rather than a directory.

  • Copy empty folders: Select to copy empty directories. The Include subfolders option must also be selected for this option to take effect.

  • Create destination folder: Select to create the specified destination directory if it does not exist.

  • Replace existing files: Select to overwrite duplicate files in the destination directory.

  • Remove source files: Select to remove the source files after copying them, which is equivalent to a move operation.

  • Copy previous results to arguments: Select to use the previous step's results as your sources and destinations.

  • Add files to result files name: Select to create a list of the files that were copied by this entry.

If you are not using Kerberos security, this job entry sends the username of the logged-in user when copying the files, regardless of the username entered in the connection field. To change the username, set the HADOOP_USER_NAME environment variable to the username you want to use. You can set this variable by changing the OPT variable in the spoon.bat or spoon.sh file, as shown in the following example:

OPT="$OPT .... -DHADOOP_USER_NAME=HadoopNameToSpoof"