How can you implement machine learning in Spark?

1 Answer

We can implement machine learning in Spark by using MLlib, Spark's scalable machine learning library. MLlib is designed to make machine learning scalable and straightforward, providing common learning algorithms and utilities for use cases like clustering, collaborative filtering, dimensionality reduction, etc.

...