You can pipe the output of wget straight into HDFS, without staging the file on local disk first.
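For example, a minimal sketch (the URL and the HDFS path are placeholders):

wget -qO- http://example.com/data/test123.txt | hdfs dfs -put - FolderName/test123.txt

Here wget -qO- writes the downloaded file to stdout, and hdfs dfs -put - reads from stdin instead of a local file.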
One caveat: gz files are not splittable, so a .gz file in HDFS is read by a single mapper, which will stop you from running distributed MapReduce code over it.
If the source is gzipped, I would suggest downloading the file to a local system, unzipping it, and then using the pipe operator:
cat test123.txt | ssh uname@master "hdfs dfs -put - FolderName/test123.txt"
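Put together, the full workflow might look like this rough sketch (the URL is a placeholder, and uname@master stands for your Hadoop gateway node):

wget http://example.com/data/test123.txt.gz   # download to the local system
gunzip test123.txt.gz                         # decompress, leaving test123.txt
cat test123.txt | ssh uname@master "hdfs dfs -put - FolderName/test123.txt"

Decompressing before the upload means the file lands in HDFS as plain text, so MapReduce can split it across multiple mappers.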