The behaviour you are seeing is expected. Let me explain what happens when you work with hadoop fs commands.
The command's syntax is: hadoop fs -ls [path]
By default, when you don't specify [path] for the above command, hadoop expands the path to /user/[username] in HDFS, where [username] is replaced with the Linux username of the user executing the command.
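The expansion rule can be sketched in plain shell (this is only an illustration of the rule; hadoop performs the equivalent lookup internally):

```shell
# Sketch: how the default HDFS path is derived when [path] is omitted.
# HDFS home directories conventionally live under /user/<username>.
user="$(whoami)"
default_path="/user/${user}"
echo "${default_path}"
```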
So, when you execute this command:
ubuntu@xadsam-master:~$ hadoop fs -ls
the reason you see the error ls: '.': No such file or directory is that hadoop is looking for the path /user/ubuntu, and it seems this path does not exist in HDFS.
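If you want the bare hadoop fs -ls to work, you can create that home directory first. A minimal sketch, assuming your username is ubuntu (substitute your own; this requires a running HDFS cluster and appropriate permissions):

```shell
# Create the HDFS home directory for the current user, then list it.
# -p creates parent directories as needed, like the Unix mkdir -p.
hadoop fs -mkdir -p /user/ubuntu
hadoop fs -ls
```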
The reason why this command:
ubuntu@sam-master:~$ hadoop fs -ls hdfs://101-master:50000/
works is that you have explicitly specified [path], and it is the root of HDFS. You can do the same with this:
ubuntu@sam-master:~$ hadoop fs -ls /
which is automatically evaluated as the root of HDFS.
Hope this clears up the behaviour you see when executing the hadoop fs -ls command.
As a side note, if you want to specify a local file system path, use the file:/// URL scheme.
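For example, this lists a local directory through the hadoop fs shell instead of HDFS (a sketch; it assumes hadoop is installed and /tmp exists locally):

```shell
# The file:// scheme makes hadoop fs operate on the local file system,
# bypassing whatever fs.defaultFS points to.
hadoop fs -ls file:///tmp
```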