How to find the number of blocks for a file in Hadoop?
Hi Team,
I have configured a Hadoop cluster on my local system. I want to find out how many blocks a file occupies in HDFS. How can I do that?
Hi @akhtar,
You can use the HDFS file system check utility, fsck, to inspect the blocks of a specific file. To list the blocks for a given file, run the command below:
$ hadoop fsck /path/to/file -files -blocks
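If you only need the block count rather than the full block report, you can also derive it from the file size and the configured block size, since HDFS splits a file into fixed-size blocks (128 MB by default in Hadoop 2.x/3.x) and the last block may be partially filled. Here is a minimal sketch of that arithmetic; the function name `num_blocks` is just an illustration, not part of any Hadoop API:

```python
import math

def num_blocks(file_size_bytes, block_size_bytes=128 * 1024 * 1024):
    """Estimate how many HDFS blocks a file occupies.

    HDFS stores a file as fixed-size blocks (128 MB by default),
    so the count is the file size divided by the block size,
    rounded up. An empty file occupies no blocks.
    """
    if file_size_bytes == 0:
        return 0
    return math.ceil(file_size_bytes / block_size_bytes)

# Example: a 300 MB file with the default 128 MB block size
# spans 3 blocks (128 + 128 + 44 MB).
print(num_blocks(300 * 1024 * 1024))
```

Note that this only matches what fsck reports if you pass the block size actually configured for the file (`dfs.blocksize` may differ per cluster or per file).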