How to store files in the executor's working directory

0 votes
I have a Spark application running and there are executors to execute the task. I have some files that I want to store in the executor's working directory. How can I do this?
Mar 28, 2019 in Apache Spark by Simran
4,208 views

1 answer to this question.

0 votes

You have to specify a comma-separated list of the files you want placed in the executor's working directory, using the spark.yarn.dist.files property. Pass it when submitting your application:

./bin/spark-submit <all your existing options> --conf spark.yarn.dist.files=<list of files>

If your application creates its context with val sc = new SparkContext(new SparkConf()), the property passed on the command line is picked up automatically, and on YARN the listed files are copied into each executor's working directory.
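A minimal Scala sketch of the same idea, setting the property through SparkConf instead of on the command line; the file paths and app name below are placeholders, not from the original question:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical local paths; replace with the files you want shipped.
val conf = new SparkConf()
  .setAppName("dist-files-example")
  .set("spark.yarn.dist.files", "/local/path/lookup.csv,/local/path/app.conf")

val sc = new SparkContext(conf)

// On YARN, each executor container receives the listed files in its
// working directory, so a task can open them by bare file name,
// e.g. scala.io.Source.fromFile("lookup.csv").

The --files option of spark-submit serves a similar purpose and, on YARN, also places the files in each executor's working directory.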
answered Mar 28, 2019 by Raj

Related Questions In Apache Spark

0 votes
1 answer

How to set executors for static allocation in Spark Yarn?

Open Spark shell and run the following ...READ MORE

answered Mar 28, 2019 in Apache Spark by Raj
1,494 views
0 votes
1 answer

How do I connect to a HIVE Meta store through a program in SparkSQL?

In spark 2.0.+ it should look something ...READ MORE

answered Sep 5, 2019 in Apache Spark by ravikiran
• 4,620 points
4,389 views
0 votes
1 answer

How to unzip a folder to individual files in HDFS?

Hi, @Amey, You can go through this regarding ...READ MORE

answered May 26, 2020 in Apache Spark by Gitika
• 65,770 points
2,922 views
0 votes
1 answer

How to get the number of elements in partition?

rdd.mapPartitions(iter => Array(iter.size).iterator, true) This command will ...READ MORE

answered May 8, 2018 in Apache Spark by kurt_cobain
• 9,350 points
2,211 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
11,028 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,535 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
108,830 views
0 votes
1 answer

How to automatically kill executors on blacklisting?

You can set the property to directly ...READ MORE

answered Mar 12, 2019 in Apache Spark by Veer
1,184 views
0 votes
1 answer

How to enable dynamic resource allocation in Spark?

To dynamically enable dynamic resource allocation, you ...READ MORE

answered Mar 12, 2019 in Apache Spark by veer
1,729 views