What is the use of Akka in PySpark?
0 votes · asked Mar 13, 2022 in PySpark by rajeshsharma
Tags: pyspark
1 Answer
0 votes · answered Mar 13, 2022 by rajeshsharma
Akka is used in PySpark for scheduling. After a worker registers with the master, it requests a task, and the master assigns one to it. Akka handles the sending and receiving of these messages between the master and the workers. (Note that Spark 2.0 and later replaced Akka with Spark's own Netty-based RPC layer, so this description applies to older Spark versions.)
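To make the register → request → assign exchange concrete, here is a minimal toy sketch of that actor-style message flow in plain Python. This is not Spark or Akka code; the class names, message types, and task IDs are all invented for illustration.

```python
# Toy sketch of the master/worker message exchange described above.
# NOT actual Spark/Akka internals -- all names here are invented.
from queue import Queue


class Master:
    def __init__(self):
        self.tasks = ["task-1", "task-2"]  # pending work to hand out
        self.workers = {}                  # registered workers by id

    def handle(self, msg, sender):
        # Dispatch on message type, like an actor's receive loop.
        if msg["type"] == "register":
            self.workers[msg["worker_id"]] = sender
            sender.inbox.put({"type": "registered"})
        elif msg["type"] == "request_task":
            task = self.tasks.pop(0) if self.tasks else None
            sender.inbox.put({"type": "assign_task", "task": task})


class Worker:
    def __init__(self, worker_id, master):
        self.worker_id = worker_id
        self.master = master
        self.inbox = Queue()  # mailbox for replies from the master

    def register(self):
        self.master.handle({"type": "register", "worker_id": self.worker_id}, self)
        return self.inbox.get()

    def request_task(self):
        self.master.handle({"type": "request_task"}, self)
        return self.inbox.get()


master = Master()
worker = Worker("worker-1", master)
print(worker.register())      # {'type': 'registered'}
print(worker.request_task())  # {'type': 'assign_task', 'task': 'task-1'}
```

In real Spark these messages travel over the network asynchronously between separate JVM processes; the synchronous in-process calls here only illustrate the protocol's shape.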
Related questions (all asked Mar 13, 2022 in PySpark by rajeshsharma):
- What is PySpark? / What do you know about PySpark?
- What is DStream in PySpark?
- Can we create PySpark DataFrame from external data sources?
- What do you understand by startsWith() and endsWith() methods in PySpark?
- What do you understand by PySpark SparkStageinfo?
- What is PySpark SparkJobinfo?
- What do you understand by custom profilers in PySpark?
- What are the key advantages of PySpark RDD?
- What do you understand by SparkSession in Pyspark?
- Why is PySpark faster than pandas?
...