[hadoop] Copy a local file system folder to HDFS

I need to copy a folder from the local file system to HDFS. I could not find any example of moving a folder (including all of its subfolders) to HDFS

$ hadoop fs -copyFromLocal /home/ubuntu/Source-Folder-To-Copy HDFS-URI

This question is related to hadoop and hdfs

The answer is


You can use:

1. Loading data from a local file to HDFS

Syntax: $ hadoop fs -copyFromLocal <localsrc> <hdfs-destination>

EX: $ hadoop fs -copyFromLocal localfile1 HDIR

2. Copying data from HDFS to local

Syntax: $ hadoop fs -copyToLocal <hdfs-source> <new file name>

EX: $ hadoop fs -copyToLocal hdfs/filename myunx
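For the folder case asked about in the question, the same command works; a minimal sketch, assuming the destination directory /user/ubuntu already exists in HDFS:

$ hadoop fs -copyFromLocal /home/ubuntu/Source-Folder-To-Copy /user/ubuntu

This creates /user/ubuntu/Source-Folder-To-Copy in HDFS with the entire directory tree underneath it.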


Navigate to your "/install/hadoop/datanode/bin" folder, or to whatever path you execute your hadoop commands from:

To place files in HDFS: Format: hadoop fs -put "local system path"/filename.csv "HDFS destination path"

e.g. ./hadoop fs -put /opt/csv/load.csv /user/load

Here /opt/csv/load.csv is the source file path on my local Linux system.

/user/load is the destination path on the HDFS cluster, i.e. "hdfs://hacluster/user/load".
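If the destination directory does not exist yet, you may need to create it first; a small sketch using the same path as above:

hadoop fs -mkdir -p /user/load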

To get files from HDFS to the local system: Format: hadoop fs -get "/HDFS source file path" "/local path"

e.g. hadoop fs -get /user/load/a.csv /opt/csv/

After executing the above command, a.csv from HDFS is downloaded to the /opt/csv folder on the local Linux system.

The uploaded files can also be seen through the HDFS NameNode web UI.
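You can also verify the upload from the shell; for example, with the destination path used above:

hadoop fs -ls /user/load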


If you copy a folder from local, it will be copied with all of its subfolders to HDFS.

For copying a folder from local to HDFS, you can use

hadoop fs -put localpath

or

hadoop fs -copyFromLocal localpath

or

hadoop fs -put localpath hdfspath

or

hadoop fs -copyFromLocal localpath hdfspath

Note:

If you do not specify an HDFS path, the folder will be copied to HDFS with the same name as that folder, into your HDFS home directory.
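A quick sketch of that default behavior, assuming your HDFS home directory /user/<your-user> exists and a local folder named myfolder (both names are just for illustration):

hadoop fs -put myfolder    # lands in /user/<your-user>/myfolder
hadoop fs -ls              # lists your HDFS home directory; myfolder should appear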

To copy from HDFS to local

 hadoop fs -get hdfspath localpath
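For example, to fetch the illustrative myfolder from above back to the local /tmp directory:

hadoop fs -get myfolder /tmp/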

To copy a folder or file from local to HDFS, you can use the command below

hadoop fs -put /path/localpath  /path/hdfspath

or

hadoop fs -copyFromLocal /path/localpath  /path/hdfspath

You could try:

hadoop fs -put /path/in/linux /hdfs/path

or even

hadoop fs -copyFromLocal /path/in/linux /hdfs/path

By default, both put and copyFromLocal upload directories recursively to HDFS.
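A short sketch demonstrating the recursion, reusing the illustrative paths from above (the nested local tree is made up):

mkdir -p /path/in/linux/sub1/sub2           # build a nested local tree
touch /path/in/linux/sub1/sub2/file.txt
hadoop fs -put /path/in/linux /hdfs/path    # one command uploads the whole tree
hadoop fs -ls -R /hdfs/path                 # -R lists the HDFS tree recursively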


In Short

hdfs dfs -put <localsrc> <dest>

In detail with example:

Checking source and target before placing files into HDFS

[cloudera@quickstart ~]$ ll files/
total 132
-rwxrwxr-x 1 cloudera cloudera  5387 Nov 14 06:33 cloudera-manager
-rwxrwxr-x 1 cloudera cloudera  9964 Nov 14 06:33 cm_api.py
-rw-rw-r-- 1 cloudera cloudera   664 Nov 14 06:33 derby.log
-rw-rw-r-- 1 cloudera cloudera 53655 Nov 14 06:33 enterprise-deployment.json
-rw-rw-r-- 1 cloudera cloudera 50515 Nov 14 06:33 express-deployment.json

[cloudera@quickstart ~]$ hdfs dfs -ls
Found 1 items
drwxr-xr-x   - cloudera cloudera          0 2017-11-14 00:45 .sparkStaging

Copy files to HDFS using the -put or -copyFromLocal command

[cloudera@quickstart ~]$ hdfs dfs -put files/ files

Verify the result in HDFS

[cloudera@quickstart ~]$ hdfs dfs -ls
Found 2 items
drwxr-xr-x   - cloudera cloudera          0 2017-11-14 00:45 .sparkStaging
drwxr-xr-x   - cloudera cloudera          0 2017-11-14 06:34 files

[cloudera@quickstart ~]$ hdfs dfs -ls files
Found 5 items
-rw-r--r--   1 cloudera cloudera       5387 2017-11-14 06:34 files/cloudera-manager
-rw-r--r--   1 cloudera cloudera       9964 2017-11-14 06:34 files/cm_api.py
-rw-r--r--   1 cloudera cloudera        664 2017-11-14 06:34 files/derby.log
-rw-r--r--   1 cloudera cloudera      53655 2017-11-14 06:34 files/enterprise-deployment.json
-rw-r--r--   1 cloudera cloudera      50515 2017-11-14 06:34 files/express-deployment.json
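Note that running the same -put again fails because the files already exist in HDFS; the -f flag overwrites them. A quick sketch with the same paths:

hdfs dfs -put -f files/ files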

From the command line:

hadoop fs -copyFromLocal <localsrc> <dest>

hadoop fs -copyToLocal <src> <localdest>

Or you can also use the Hadoop FileSystem API (org.apache.hadoop.fs.FileSystem, available from Spark as well) to get or put HDFS files.

Hope this is helpful.