How to find the port on which HDFS is running

0 votes

I am trying to access my HDFS using the fully qualified name. The syntax is:

hdfs://machine-name:port/

But I don't know on which port my HDFS is running. How do I find it?
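For example, I want to be able to run something like this (8020 here is only a guess, which is why I need to find the actual port):

hdfs dfs -ls hdfs://machine-name:8020/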

Jan 25, 2019 in Big Data Hadoop by Tamanna
3,802 views

1 answer to this question.

0 votes

If you are using Hadoop 2.7 or below, use this:

hdfs getconf -confKey fs.default.name

If you are using a higher version, use this (fs.defaultFS is the current key; fs.default.name is its deprecated alias):

hdfs getconf -confKey fs.defaultFS
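
For example, on a typical cluster the command prints the URI configured in core-site.xml; the hostname and port below are placeholders, yours will differ:

hdfs getconf -confKey fs.defaultFS
hdfs://namenode.example.com:8020

Here 8020 is a common default for the NameNode RPC port. If you prefer, you can also read the same property straight from the configuration file (assuming HADOOP_CONF_DIR points to your Hadoop config directory):

grep -A1 "fs.defaultFS" $HADOOP_CONF_DIR/core-site.xml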
answered Jan 25, 2019 by Omkar
• 69,220 points

Related Questions In Big Data Hadoop

0 votes
1 answer

How can I use my host machine’s web browser to check my HDFS services running in the VM?

The sole purpose of the virtual machine ...READ MORE

answered Apr 18, 2018 in Big Data Hadoop by Shubham
• 13,490 points
1,431 views
0 votes
1 answer

What is the command to find the free space in HDFS?

You can use dfsadmin which runs a ...READ MORE

answered Apr 29, 2018 in Big Data Hadoop by Shubham
• 13,490 points
2,179 views
0 votes
1 answer

How to find the used cache in HDFS

hdfs dfsadmin -report This command tells fs ...READ MORE

answered May 4, 2018 in Big Data Hadoop by Shubham
• 13,490 points
2,466 views
0 votes
1 answer

How to check your file is created on HDFS?

Hey,  There is a process or steps you ...READ MORE

answered May 7, 2019 in Big Data Hadoop by Gitika
• 65,770 points
1,081 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
11,028 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,536 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
108,832 views
–1 vote
1 answer

Hadoop dfs -ls command?

In your case there is no difference ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by kurt_cobain
• 9,350 points
4,612 views
0 votes
1 answer

How to find the number of blocks a hdfs file is divided into?

Yes. you can use the hadoop fsck command to do ...READ MORE

answered Nov 30, 2018 in Big Data Hadoop by Omkar
• 69,220 points
5,907 views
0 votes
1 answer

How to find the running namenodes and secondary name nodes in hadoop?

Name nodes: hdfs getconf -namenodes Secondary name nodes: hdfs getconf ...READ MORE

answered Nov 26, 2018 in Big Data Hadoop by Omkar
• 69,220 points
2,837 views