How can you implement machine learning in Spark?

1 Answer


We can implement machine learning in Spark by using MLlib, Spark's scalable machine learning library. It makes machine learning scalable and straightforward by providing common learning algorithms and utilities for use cases such as classification, regression, clustering, collaborative filtering, and dimensionality reduction.
