Can we store Hive output in Hadoop HDFS?
Hi,
You can save your Hive query output to HDFS. First start your Hadoop cluster, then run the command below, replacing the path placeholder with your own HDFS directory:
INSERT OVERWRITE DIRECTORY '<your HDFS path>' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' SELECT * FROM table LIMIT 10;
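For example, here is a minimal sketch assuming a hypothetical table named employees and an output directory /user/hive/output (both are placeholders, adjust them to your setup):

INSERT OVERWRITE DIRECTORY '/user/hive/output'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
SELECT * FROM employees LIMIT 10;

After the query finishes, you can check the result files from the shell with hdfs dfs -ls /user/hive/output; the rows typically land in one or more part files under that directory, which you can view with hdfs dfs -cat.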
Thank You