Is it mandatory to start Hadoop to run a Spark application?
No, it is not mandatory, but there ...READ MORE
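To illustrate why Hadoop is not required, here is a minimal sketch of starting Spark in local mode, where no HDFS or YARN daemons need to be running (the app name is a placeholder, not from the original answer):

from pyspark.sql import SparkSession

# Local mode: Spark uses local threads and the local filesystem,
# so no Hadoop services have to be started first.
spark = (SparkSession.builder
         .master("local[*]")            # use all local cores
         .appName("no-hadoop-demo")     # hypothetical app name
         .getOrCreate())

df = spark.range(5)   # tiny DataFrame just to confirm the session works
df.show()
spark.stop()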
Yes, it is possible to run Spark ...READ MORE
from pyspark.sql.types import FloatType
fname = [1.0, 2.4, 3.6, 4.2, 45.4]
df = spark.createDataFrame(fname, ...READ MORE
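A sketch of how that truncated snippet likely continues, assuming an active SparkSession; the session setup and the resulting column name ("value") are assumptions, not part of the original answer:

from pyspark.sql import SparkSession
from pyspark.sql.types import FloatType

spark = SparkSession.builder.master("local[*]").appName("float-df").getOrCreate()  # assumed session
fname = [1.0, 2.4, 3.6, 4.2, 45.4]
# A flat list of floats plus a FloatType schema gives a single-column DataFrame.
df = spark.createDataFrame(fname, FloatType())
df.show()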
Hi @Edureka, checkpointing is the process of truncating RDD ...READ MORE
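For context, a minimal checkpointing sketch; the checkpoint directory path is a placeholder and not taken from the original answer:

from pyspark import SparkContext

sc = SparkContext("local[*]", "checkpoint-demo")
sc.setCheckpointDir("/tmp/spark-checkpoints")   # placeholder path

rdd = sc.parallelize(range(10)).map(lambda x: x * x)
rdd.checkpoint()   # mark the RDD; its lineage is truncated once it is materialized
rdd.count()        # an action triggers the actual checkpoint write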
Instead of splitting on '\n', you should ...READ MORE
Firstly, you need to understand the concept ...READ MORE
org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE
Hi, you can create a directory in HDFS ...READ MORE
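A small sketch of creating an HDFS directory from Python, assuming the hdfs CLI is on the PATH; the target path is only an example:

import subprocess

# -mkdir -p creates the directory along with any missing parent directories.
subprocess.run(["hdfs", "dfs", "-mkdir", "-p", "/user/example/new_dir"], check=True)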
Hi, these are the steps to run Spark in ...READ MORE
Hi, the yield keyword is used because the ...READ MORE
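For readers unfamiliar with yield, a small generator example; it is illustrative only and not the code from the original answer:

def squares(n):
    # yield turns the function into a generator: values are produced
    # lazily, one at a time, instead of building a full list in memory.
    for i in range(n):
        yield i * i

print(list(squares(5)))   # [0, 1, 4, 9, 16]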