I am using PuTTY (SSH) to import my CSV file into the Hadoop file system (HDFS). So far I have created a directory using the command
hadoop fs -mkdir /data
After creating the directory, I am trying to import my CSV file using the command:
hadoop fs -cp s3://cis4567-fall19/Hadoop/SalesJan2009.csv
However, I am getting an error that states:
-cp: Not enough arguments: expected 2 but got 1
First of all, "cp" needs two arguments:
1.) The source where the file is located (in your case s3://cis4567-fall19/Hadoop/SalesJan2009.csv).
2.) The destination where you want to copy the file (in your case /data on HDFS).
You have only given a source; no destination was specified.
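For illustration, here is what a correct two-argument "cp" invocation looks like for a copy within HDFS (the paths here are hypothetical, just to show the shape of the command):

```shell
# Copy a file from one HDFS location to another: source first, destination second.
# Both paths below are examples, not the asker's actual data.
hadoop fs -cp /staging/SalesJan2009.csv /data/

# Verify the file arrived at the destination.
hadoop fs -ls /data
```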
Secondly, even if you use "cp" with both arguments, it will not copy from S3 to HDFS directly; it will give an error.
To copy a file directly from S3 to HDFS you need the "distcp" command. Your command would look something like this:
"Hadoop distcp s3n:/cis4567-fall19/Hadoop/SalesJan2009.csv /data"