Copying files from local machine to remote HDFS cluster directly

I want to copy a file directly from my local Linux machine to a remote HDFS cluster, i.e., I don't want to copy the file to the remote machine first and then move it into HDFS with the copyFromLocal command.

For this, I executed the following command, which I found in an online blog:

cat sample.txt | ssh -tt root@MY_HDFS_CLUSTER "sudo -u hdfs hdfs dfs -put - /user/myuser/data/sample/sample.txt"

The issue I am facing is that the file's contents are transferred from my local machine to the remote HDFS cluster, but a file named sample.txt.COPYING appears in HDFS instead of sample.txt. The copy process also runs indefinitely; it never terminates, and I have to kill it with Ctrl+C.
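In case it is useful, the state of the target directory while the command is running can be inspected with something like the following (same cluster and path as in the command above):

ssh root@MY_HDFS_CLUSTER "sudo -u hdfs hdfs dfs -ls /user/myuser/data/sample/"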

It would be much appreciated if someone could help me resolve this issue.

Answers


How big is your sample.txt file? I suspect it is large, and the process is not terminating because it has not actually finished copying the file yet.
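If that is the case, you should be able to watch the copy progress from the cluster side. A quick check, assuming the same path as in your command (hdfs dfs -du reports the current size of each entry):

ssh root@MY_HDFS_CLUSTER "sudo -u hdfs hdfs dfs -du -h /user/myuser/data/sample/"

While -put is still streaming, HDFS writes to a temporary file with a ._COPYING_ suffix and only renames it to sample.txt once the copy completes, so a growing ._COPYING_ entry means the transfer is still in progress.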

