
Jan 13 in Big Data | Hadoop
Q: What are the two ways to create RDD in Spark?

1 Answer

Jan 13

We can create an RDD in Spark in the following two ways:

1. Internal: We can parallelize an existing collection of data within our Spark driver program and create an RDD from it.

2. External: We can also create an RDD by referencing a dataset in an external data source such as AWS S3, HDFS, HBase, etc.

 

