Scala join comma delimited file as tables

0 votes
I have two datasets, both in CSV format, and I want to load them as tables and join them. Can you give an example for this scenario?
Jul 9, 2019 in Apache Spark by Lohit
1,040 views

1 answer to this question.

0 votes

DataFrame creation commands:
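The image with the exact commands is missing, so here is a minimal sketch of what they would look like, assuming two hypothetical files stud1.csv and stud2.csv that share an "id" column (the file names and column are assumptions):

// Read both comma-delimited CSV files (with header rows) into DataFrames
// stud1.csv and stud2.csv are hypothetical file names
val df1 = sqlContext.read.format("csv").option("header", "true").option("delimiter", ",").load("stud1.csv")
val df2 = sqlContext.read.format("csv").option("header", "true").option("delimiter", ",").load("stud2.csv")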

Now we will register them as temp tables and run the join with Spark SQL:
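The image with the registration and join commands is also missing. A rough sketch of the idea, assuming the two DataFrames above and a shared "id" column (table names and join column are illustrative):

// Register both DataFrames as temporary views
// (createOrReplaceTempView is the Spark 2.x replacement for the older registerTempTable)
df1.createOrReplaceTempView("table1")
df2.createOrReplaceTempView("table2")

// Join the two tables with Spark SQL on the shared column
val joined = sqlContext.sql("SELECT * FROM table1 t1 JOIN table2 t2 ON t1.id = t2.id")
joined.show()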

For loading a comma-delimited CSV file into a DataFrame, refer to the command below:

val df1 = sqlContext.read.format("csv").option("header", "true").option("delimiter", ",").option("inferSchema", "false").load("stud.csv")
answered Jul 9, 2019 by Suraj
