What are the common Transformations in Apache Spark?

1 Answer


Some of the common transformations in Apache Spark are as follows:

1. map(func): A basic transformation that returns a new dataset formed by passing each element of the input dataset through the function func.
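The per-element contract of map can be sketched with plain Python lists (no Spark cluster assumed here; in PySpark the equivalent call would be rdd.map(lambda x: x * 2)):

```python
# map(func): apply func to every element, producing a new dataset
# of the same length. Plain-Python sketch of the contract.
data = [1, 2, 3, 4]
mapped = [x * 2 for x in data]   # analogous to rdd.map(lambda x: x * 2)
print(mapped)                    # [2, 4, 6, 8]
```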

 

2. filter(func): Returns a new dataset containing only the elements for which func returns true. It is used to filter a dataset based on the criteria encoded in the func function.
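Again as a plain-Python sketch (the PySpark form would be rdd.filter(lambda x: x % 2 == 0)), filter keeps exactly the elements whose predicate is true:

```python
# filter(func): keep only the elements where func returns True.
data = [1, 2, 3, 4, 5, 6]
evens = [x for x in data if x % 2 == 0]  # analogous to rdd.filter(...)
print(evens)                             # [2, 4, 6]
```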

3. union(otherDataset): Combines a dataset with another dataset to form the union of the two. Note that, unlike a set union, duplicate elements are kept (use distinct() afterwards to remove them).
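Spark's union behaves like concatenation rather than a mathematical set union, which a plain-Python sketch makes clear:

```python
# union(otherDataset): concatenate two datasets; duplicates survive.
a = [1, 2, 3]
b = [3, 4, 5]
combined = a + b   # analogous to rdd_a.union(rdd_b)
print(combined)    # [1, 2, 3, 3, 4, 5] -- note the repeated 3
```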

4. intersection(otherDataset): This transformation returns the elements common to the two datasets. Unlike union, the result contains no duplicates.
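The semantics here are those of a true set intersection, sketched below with plain Python sets (sorted() is only for a deterministic display order; Spark itself makes no ordering guarantee):

```python
# intersection(otherDataset): elements present in both datasets,
# with duplicates removed -- genuine set semantics.
a = [1, 2, 2, 3]
b = [2, 3, 4]
common = sorted(set(a) & set(b))  # analogous to rdd_a.intersection(rdd_b)
print(common)                     # [2, 3]
```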

 

5. pipe(command, [envVars]): Pipes each partition of the dataset through a shell command, such as a Perl or bash script. Elements are written to the process's stdin, and lines from its stdout are returned as a dataset of strings.
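What pipe does to a single partition can be sketched with Python's subprocess module, standing in for Spark's per-partition plumbing (this assumes a Unix-like system where the cat command is available; in PySpark the call would be rdd.pipe("cat")):

```python
import subprocess

# One partition's elements are written to the command's stdin,
# one per line; each stdout line becomes an element of the result.
partition = ["spark", "hadoop", "hive"]
proc = subprocess.run(
    ["cat"],                        # identity command for illustration
    input="\n".join(partition),
    capture_output=True,
    text=True,
)
result = proc.stdout.splitlines()
print(result)                       # ['spark', 'hadoop', 'hive']
```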

 
