Can anyone help me install and configure a Multi-Node Hadoop Cluster?

0 votes
I am a fresher in Hadoop technology and have recently finished my certification in Hadoop. I want to learn how to set up a multi-node Hadoop cluster on Windows. My requirement is 1 master node and 3 slave nodes. I have been unable to do it even after spending many hours searching the internet, and I have also tried AWS, which hardly worked. I am even ready to switch the OS. Help me out on this one.
Jun 4, 2019 in Big Data Hadoop by nitinrawat895
• 11,380 points
1,367 views

1 answer to this question.

0 votes

To install Hadoop on a 4-node cluster (1 master, 3 slaves), follow the steps below:

Step 1: Get rid of Windows. Hadoop currently runs on Linux machines. You can use Ubuntu 14.04 or later (or CentOS, Red Hat, etc.).

Step 2: Install and set up Java:

$ sudo apt-get install python-software-properties
$ sudo add-apt-repository ppa:ferramroberto/java
$ sudo apt-get update
$ sudo apt-get install sun-java6-jdk

# Select Sun's Java as the default on your machine.
# See 'sudo update-alternatives --config java' for more information.    
#
$ sudo update-java-alternatives -s java-6-sun
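
To confirm the JDK is installed and selected as the default, check the reported version (the exact version string depends on which JDK you installed):

$ java -version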

Step 3: Set the path in the .bashrc file (open the file with a text editor such as vi or nano and append the lines below, pointing JAVA_HOME at the directory where your JDK is actually installed):

export JAVA_HOME=/usr/local/jdk1.7.0_71
export PATH=$PATH:$JAVA_HOME/bin
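
The new variables only take effect in a fresh shell; to apply them to the current session and verify that JAVA_HOME is set, you can run:

$ source ~/.bashrc
$ echo $JAVA_HOME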

Step 4: Add a dedicated hadoop user on every node (not strictly required, but recommended):

# useradd hadoop 
# passwd hadoop

Step 5: Edit the /etc/hosts file on all nodes, specifying the IP address of each system followed by its hostname (open the file with vi /etc/hosts and append the text below):

<ip address of master node> hadoop-master 
<ip address of slave node 1> hadoop-slave-1 
<ip address of slave node 2> hadoop-slave-2
<ip address of slave node 3> hadoop-slave-3
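
A quick sanity check is to ping each hostname from the master and confirm it resolves to the right machine (hostnames as defined above):

$ ping -c 1 hadoop-slave-1
$ ping -c 1 hadoop-slave-2
$ ping -c 1 hadoop-slave-3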

Step 6: Set up SSH on every node so that the machines can communicate with one another without a password prompt:

$ su hadoop
$ ssh-keygen -t rsa 
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@hadoop-master
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@hadoop-slave-1
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@hadoop-slave-2
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@hadoop-slave-3
$ chmod 0600 ~/.ssh/authorized_keys 
$ exit
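
You can verify passwordless login from the hadoop account on the master by connecting to each node; none of these commands should prompt for a password:

$ ssh hadoop@hadoop-slave-1 exit
$ ssh hadoop@hadoop-slave-2 exit
$ ssh hadoop@hadoop-slave-3 exit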

For more information on SSH, see https://www.ssh.com/ssh/

Step 7: On the master server, download and install Hadoop:

# mkdir /opt/hadoop
# cd /opt/hadoop/
# wget http://apache.mesi.com.ar/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
# tar -xzf hadoop-1.2.1.tar.gz
# mv hadoop-1.2.1 hadoop
# chown -R hadoop /opt/hadoop
# cd /opt/hadoop/hadoop/

Installation is finished here!
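
As a quick sanity check, you can ask the freshly unpacked distribution for its version from the Hadoop directory:

# bin/hadoop version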

Next step: Configuring Hadoop

Step 1: Open core-site.xml (in /opt/hadoop/hadoop/conf) and edit it as below:

<configuration>
<property> 
  <name>fs.default.name</name> 
  <value>hdfs://hadoop-master:9000/</value> 
</property> 
<property> 
  <name>dfs.permissions</name> 
  <value>false</value> 
</property> 
</configuration>

Step 2: Open hdfs-site.xml and edit it as below:

<configuration>
<property> 
  <name>dfs.data.dir</name> 
  <value>/opt/hadoop/hadoop/dfs/name/data</value> 
  <final>true</final> 
</property> 

<property> 
  <name>dfs.name.dir</name> 
  <value>/opt/hadoop/hadoop/dfs/name</value> 
  <final>true</final> 
</property> 

<property> 
  <name>dfs.replication</name> 
  <value>3</value> 
</property> 
</configuration>
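
The dfs.name.dir and dfs.data.dir locations above are not guaranteed to exist yet, so it is safer to create them up front and make sure the hadoop user owns them (paths assumed to match the values configured above):

# mkdir -p /opt/hadoop/hadoop/dfs/name/data
# chown -R hadoop /opt/hadoop/hadoop/dfs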

Step 3: Open mapred-site.xml and edit it as below:

<configuration>
<property> 
  <name>mapred.job.tracker</name> 
  <value>hadoop-master:9001</value> 
</property> 
</configuration>

Step 4: Append the lines below to hadoop-env.sh:

export JAVA_HOME=/usr/local/jdk1.7.0_71
export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true
export HADOOP_CONF_DIR=/opt/hadoop/hadoop/conf

Step 5: Configure the masters file (Hadoop 1.x keeps its configuration under conf/):

$ vi conf/masters
hadoop-master

Step 6: Install it on the slave nodes as well --

# su hadoop 
$ cd /opt/hadoop 
$ scp -r hadoop hadoop-slave-1:/opt/hadoop 
$ scp -r hadoop hadoop-slave-2:/opt/hadoop
$ scp -r hadoop hadoop-slave-3:/opt/hadoop
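
Equivalently, if you add more slaves later, the same copy can be done in a single loop (hostnames assumed to be the ones defined in /etc/hosts):

$ for host in hadoop-slave-1 hadoop-slave-2 hadoop-slave-3; do scp -r hadoop $host:/opt/hadoop; done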

Step 7: Configure the slaves file --

$ vi conf/slaves
hadoop-slave-1 
hadoop-slave-2
hadoop-slave-3

Step 8: Format the NameNode (ONLY ONCE; formatting again will permanently destroy all data in HDFS):

# su hadoop
$ cd /opt/hadoop/hadoop
$ bin/hadoop namenode -format

You are all set!!

You can start the services as follows --

$ cd /opt/hadoop/hadoop
$ bin/start-all.sh
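
Once the daemons are up, jps (shipped with the JDK) is the quickest way to confirm everything started. On the master you should see NameNode, SecondaryNameNode, and JobTracker; on each slave, DataNode and TaskTracker:

$ jps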

This should help you successfully set up the Hadoop Multi-Node Cluster environment.

answered Jun 4, 2019 by ravikiran
• 4,620 points
