Is it possible to use multiple Spark versions without setting SPARK_MAJOR_VERSION?
Yes. Setting SPARK_MAJOR_VERSION is not necessary; you can launch each Spark version with its own command:

spark-shell loads Spark 1.6

spark2-shell loads Spark 2.0
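To confirm which version a shell picked up, you can print it from inside the REPL. A minimal sketch, assuming a typical two-version install; the exact version strings below are illustrative and depend on your distribution:

// Inside spark-shell, the Spark 1.x SparkContext is pre-bound as sc
sc.version      // e.g. "1.6.3"

// Inside spark2-shell, the Spark 2.x SparkSession is pre-bound as spark
spark.version   // e.g. "2.0.2"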