What is meant by RDD in Spark?
0 votes · asked Mar 14, 2020 in Spark Sql by rajeshsharma · #spark-rdd-meant
1 Answer

0 votes · answered Mar 14, 2020 by SakshiSharma
Resilient Distributed Datasets (RDDs) are fault-tolerant collections of elements that can be operated on in parallel. An RDD satisfies properties such as being distributed, immutable, and cacheable.

There are two ways to create an RDD:

Hadoop datasets — created from files in HDFS or another Hadoop-supported storage system

Parallelized collections — created by distributing an existing collection in the driver program