Hi @Bhavish. There is no single Hadoop command to copy a local file to all/multiple directories in HDFS, but you can do it with a loop in a bash script.
Create a new bash script; I named mine copytoallhdfs.sh:
$ nano copytoallhdfs.sh
I came up with the following script for your problem statement:
#!/bin/bash
# -C lists the bare HDFS paths, one per line
dirs=$(hdfs dfs -ls -C /path/to/directory)
for name in $dirs; do
  # $name is already an absolute HDFS path, so don't prepend another slash
  hdfs dfs -copyFromLocal /path/to/source "$name"
done
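One caveat: word splitting in the for loop will break any directory name that contains spaces. If that's a possibility, piping the same listing into a while-read loop is safer (a sketch using the same placeholder paths as above):
hdfs dfs -ls -C /path/to/directory | while read -r name
do hdfs dfs -copyFromLocal /path/to/source "$name"; done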
Save (Ctrl+O) and close (Ctrl+X) the file.
Now make this script executable:
$ chmod +x copytoallhdfs.sh
and run the script:
$ ./copytoallhdfs.sh
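To spot-check the result, you can list the contents of each target directory in one go (quoting the glob so HDFS expands it rather than your local shell; same placeholder path as above):
$ hdfs dfs -ls '/path/to/directory/*'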
This worked for me.