Cache vs persist in Spark

Can someone explain the difference between Spark's cache() and persist()? They both look the same to me.
Mar 8, 2019 in Apache Spark by Esha

1 answer to this question.


cache() uses only the default storage level, which is MEMORY_ONLY for RDDs (and MEMORY_AND_DISK for DataFrames/Datasets). With persist(), you can specify whichever storage level you want, so cache() is simply persist() called with the default storage level. The default persist() on an RDD stores the data in the JVM heap as deserialized objects. Whenever data is written to disk, it is always stored in serialized form.
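
Here is a minimal sketch in Scala that contrasts the two calls (it assumes a local Spark setup such as spark-shell; the RDD and DataFrame names are just illustrative):

```scala
// A minimal sketch, assuming spark-core and spark-sql are on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

val spark = SparkSession.builder().appName("CacheVsPersist").master("local[*]").getOrCreate()
val sc = spark.sparkContext

// cache() is shorthand for persist() with the default storage level:
// MEMORY_ONLY for RDDs (deserialized objects kept in the JVM heap).
val rdd = sc.parallelize(1 to 1000000)
rdd.cache()                                   // same as rdd.persist(StorageLevel.MEMORY_ONLY)

// persist() lets you pick the storage level explicitly, e.g. keep
// serialized partitions in memory and spill to disk when memory runs out.
val rdd2 = sc.parallelize(1 to 1000000)
rdd2.persist(StorageLevel.MEMORY_AND_DISK_SER)

// For DataFrames/Datasets, cache() defaults to MEMORY_AND_DISK instead.
val df = spark.range(1000000).toDF("id")
df.cache()

// unpersist() releases the cached data once it is no longer needed.
rdd.unpersist()
rdd2.unpersist()
df.unpersist()
```

You can check which storage level was actually used in the Storage tab of the Spark UI, and call unpersist() to free the space when the data is no longer needed.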

answered Mar 8, 2019 by Raj
