I am following an HDFS tutorial to learn the different HDFS commands. So far I am able to create a directory in HDFS using the command:
hdfs dfs -mkdir -p sales/january
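To confirm the directory was actually created, it can be listed recursively. (This assumes the default setup, where a relative path like `sales/january` resolves under the HDFS home directory, usually `/user/<username>`.)

```shell
# Recursively list the newly created directory tree; the relative path
# resolves against the HDFS home directory (typically /user/<username>)
hdfs dfs -ls -R sales
```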
But I am getting an error while copying data from the local file system to HDFS, even though I am following all the steps mentioned in the tutorial. Here are the screenshots:
- Here are my HDFS directories:
- Here is the directory I want to copy the file into:
![image](https://lh5.googleusercontent.com/fN-nFpq-uPkTlvpOuV4sDQRgdZ1QmgE-4Nj7SvABpuBNIrbNVGs0m3ug4hc8OwNaI570VVmoBpHnXFlns3YT-T-DiaS3lLyTZFK_O_0zYps0QONucmrPE8Kxp_vYNTQP2P2u1y_q)
Now, I am issuing the following command:
hdfs dfs -put january_sales_2017.csv sales/january/january_sale_2017.csv
The error that I am getting is:
put: `sales/january/january_sale_2017.csv': No such file or directory: `hdfs://localhost:8020/sales/january/january_sale_2017.csv`
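Notably, the error message shows the path being resolved against the root of the filesystem (`hdfs://localhost:8020/sales/january/...`) rather than under a home directory like `/user/<username>/sales/january`, so my guess is that the `-mkdir` and the `-put` are resolving the relative path differently. A couple of checks I could run (the exact paths are assumptions based on a default setup):

```shell
# Does the directory exist under my HDFS home directory?
hdfs dfs -ls /user/$USER/sales/january

# Or directly under the root, which is where the error says -put is looking?
hdfs dfs -ls /sales/january
```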
Please let me know what I am doing wrong, or whether there is a problem with my Hadoop setup. Any help would be appreciated.