What are the different storage/persistence levels in Apache Spark?

1 Answer
Apache Spark can persist intermediate data, including the results of shuffle operations. It is recommended to call the persist() method on an RDD only when the RDD will be reused; otherwise the caching overhead is wasted. Spark provides several persistence levels for storing RDDs in memory, on disk, or in a combination of both.

Levels of storage/persistence in Apache Spark are:

MEMORY_ONLY – store the RDD as deserialized Java objects in memory; partitions that do not fit are recomputed when needed (this is the default level).

MEMORY_AND_DISK – store in memory, spilling partitions that do not fit to disk.

MEMORY_ONLY_SER – store the RDD as serialized Java objects; more space-efficient than MEMORY_ONLY, but more CPU-intensive to read.

MEMORY_AND_DISK_SER – like MEMORY_ONLY_SER, but partitions that do not fit in memory are spilled to disk.

DISK_ONLY – store the RDD partitions only on disk.

OFF_HEAP – store the RDD in serialized form in off-heap memory.

Each level also has a replicated variant (e.g. MEMORY_ONLY_2) that stores each partition on two cluster nodes.
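The levels above can be selected by passing a StorageLevel to persist(). A minimal Scala sketch, assuming an existing SparkContext named `sc` (for example, the one provided by spark-shell):

```scala
import org.apache.spark.storage.StorageLevel

// Hypothetical example RDD; assumes a running SparkContext `sc`.
val rdd = sc.parallelize(1 to 100)

// Persist with an explicit storage level: MEMORY_AND_DISK keeps
// partitions in memory and spills the rest to disk.
rdd.persist(StorageLevel.MEMORY_AND_DISK)

// cache() is shorthand for persist(StorageLevel.MEMORY_ONLY).
// The first action materializes the cached data; later actions reuse it.
println(rdd.sum())

// Release the cached data once the RDD is no longer reused.
rdd.unpersist()
```

Note that persist() is lazy: the data is only cached when the first action (such as sum() above) runs.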
