What is meant by RDD in Spark?
0
votes
asked
Mar 14, 2020
in
Spark Sql
by
rajeshsharma
#spark-rdd-meant
1
Answer
0
votes
answered
Mar 14, 2020
by
SakshiSharma
A Resilient Distributed Dataset (RDD) is Spark's fundamental data structure: a fault-tolerant collection of elements that can be operated on in parallel. RDDs are distributed across the cluster, immutable, and cacheable in memory.
There are two types of RDD:
Hadoop datasets
Parallelized Collections
...