Which version of Sqoop should I use for Hadoop 3.3.0?

0 votes

I am trying to install Sqoop 1.4.7 on Windows 10 with Hadoop 3.3.0.

When I run ./configure-sqoop in Git Bash, I get the following output:

Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
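From what I understand, these warnings only matter if those components are actually installed; a minimal sketch of the Git Bash exports the script is checking for (the paths here are just placeholders, not my real layout) would be:

# Placeholder paths -- only relevant if these components are actually installed.
export HBASE_HOME=/c/hbase
export HCAT_HOME=/c/hive/hcatalog
export ACCUMULO_HOME=/c/accumulo
export ZOOKEEPER_HOME=/c/zookeeper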

On verifying the installation with sqoop.cmd version, I get:

Warning: HBASE_HOME and HBASE_VERSION not set.
Warning: HCAT_HOME not set
Warning: HCATALOG_HOME does not exist HCatalog imports will fail.
Please set HCATALOG_HOME to the root of your HCatalog installation.
Warning: ACCUMULO_HOME not set.
Warning: ZOOKEEPER_HOME not set.
Warning: HBASE_HOME does not exist HBase imports will fail.
Please set HBASE_HOME to the root of your HBase installation.
Warning: ACCUMULO_HOME does not exist Accumulo imports will fail.
Please set ACCUMULO_HOME to the root of your Accumulo installation.
Warning: ZOOKEEPER_HOME does not exist Accumulo imports will fail.
Please set ZOOKEEPER_HOME to the root of your Zookeeper installation.
The system cannot find the path specified.
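Since the last line complains about a path, a quick sanity check from Git Bash that the core variables point at real directories (the exact paths are machine-specific, so this is only a sketch) would be something like:

# Each variable should print a directory that actually exists;
# "The system cannot find the path specified" often means one of them does not.
echo "$JAVA_HOME"   && ls "$JAVA_HOME/bin"
echo "$HADOOP_HOME" && ls "$HADOOP_HOME/bin"
echo "$SQOOP_HOME"  && ls "$SQOOP_HOME/bin"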

Please help with a solution to this problem.

Sep 7, 2020 in Big Data Hadoop by shresht
• 140 points

reshown Sep 7, 2020 by Gitika 2,283 views
Did you get Sqoop 1.4.7 to work with Hadoop 3.x?

Hi@Ed,

Are you getting any errors with the combination you mentioned? If so, please share the error. We will help you as soon as possible.

1 answer to this question.

0 votes

Hi@shresht,

These are just warnings. Did you get any actual error after them? Also, please share your .bashrc file with us. But before that, check all the steps in the link below.

https://www.edureka.co/community/39201/sqoop-installation-on-linux
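For comparison, here is a minimal sketch of the Sqoop-related lines a .bashrc for Git Bash on Windows might contain; the HADOOP_HOME path is a placeholder, and the SQOOP_HOME path is taken from your warning messages:

# Placeholder layout -- adjust every path to your actual install directories.
export HADOOP_HOME=/c/hadoop-3.3.0
export SQOOP_HOME=/c/sqoop_data/sqoop-1.4.7.bin__hadoop-2.6.0
export PATH="$PATH:$HADOOP_HOME/bin:$SQOOP_HOME/bin"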

answered Sep 7, 2020 by MD
• 95,460 points
No errors, but sqoop -version should give me the version of Sqoop that I am using. Also, I am installing it on a Windows 10 machine using Git Bash, and the link you shared is for installation on Ubuntu.
OK. Have you set an environment variable for SQOOP on your Windows system? Also, check whether your Hadoop cluster is working or not.
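A rough way to check both, assuming a pseudo-distributed Hadoop setup with its binaries on the PATH:

# Hadoop check -- daemon names assume a pseudo-distributed cluster.
jps                     # should list NameNode, DataNode, ResourceManager, NodeManager
hdfs dfsadmin -report   # should show at least one live datanode
# Sqoop check -- after the warnings you should still get a version banner.
sqoop.cmd version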
