Can the number of Spark tasks be greater than the executor cores?

0 votes
What happens when the number of Spark tasks is greater than the number of executor cores? How does Spark handle this scenario?
Jun 17, 2020 in Apache Spark by Rishi
• 160 points

edited Jun 17, 2020 by MD • 2,019 views

1 answer to this question.

0 votes

Hi@Rishi,

Yes, the number of Spark tasks can be greater than the number of executor cores. In that situation, the extra task threads simply sit in the TIMED_WAITING state until a core becomes free. Each task needs one executor core, so as soon as an executor core finishes its current task, the next pending task is automatically assigned to it. You can also increase the number of executors, but that depends on the memory available in your cluster.
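For illustration, here is a minimal PySpark sketch (run in local mode, with a hypothetical app name) that creates more tasks than available cores: the master is limited to 2 cores, but the RDD has 8 partitions, so 8 tasks are created and Spark runs them 2 at a time while the rest wait.

from pyspark.sql import SparkSession

# Only 2 cores are made available for tasks (assumption for this demo).
spark = (
    SparkSession.builder
    .master("local[2]")
    .appName("tasks-vs-cores")   # hypothetical app name
    .getOrCreate()
)

# 8 partitions -> 8 tasks for this stage, scheduled 2 at a time.
rdd = spark.sparkContext.parallelize(range(100), numSlices=8)
print(rdd.map(lambda x: x * x).sum())

spark.stop()

The job still completes; the extra tasks are just queued until a core is released.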

To know more about PySpark, it's recommended that you join a PySpark Certification course.

answered Jun 17, 2020 by MD
• 95,460 points
