Apache Spark
web_dev, 2019-02-20 17:41:07

Spark Master as Driver in Kubernetes Cluster?

Hello, has anyone here managed to run Spark on Kubernetes?
So far I haven't been able to google a solution, so maybe someone can suggest something.
I would like to have a driver in Kubernetes that also acts as the Spark master: spark-submit would be launched from it, the job would be monitored from there, and it would be immediately visible in the Web UI and later in the History Server.
Those who have launched Spark on Kubernetes (spark-submit on k8s) know that Kubernetes automatically creates the driver and executor pods. Once the executors finish, the driver leaves the Running status, so the Web UI is no longer available.
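For reference, a minimal sketch of the kind of submission I mean (the API server address, container image, and jar path are placeholders, not my actual setup):

./bin/spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<spark-image> \
  local:///path/to/examples.jar
# Kubernetes spins up one driver pod plus the executor pods; when the job finishes,
# the driver pod is no longer Running and its Web UI disappears with it.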
I know that it is possible to install Spark as a standalone cluster: Master (pod) + Workers (pods), but that is a slightly different workflow.
Is this possible? What solutions with Kubernetes have you come up with?
Thanks for the ideas and advice.


1 answer
sergeysamsonov, 2020-05-20
@sergeysamsonov

Good afternoon!
I would suggest using the Kubernetes Operator.
This pattern involves managing the lifecycle of a Spark application directly on a K8s cluster.
You can read more about our experience in the article.
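As a rough illustration (assuming the open-source spark-on-k8s-operator; the image, jar path, and service account below are placeholders), a job is described declaratively as a SparkApplication resource, and the operator runs spark-submit and tracks the application's status for you:

# Submit a SparkApplication custom resource; the operator handles spark-submit,
# restarts, and status tracking (visible via `kubectl get sparkapplications`).
cat <<'EOF' | kubectl apply -f -
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  type: Scala
  mode: cluster
  image: <spark-image>                                  # placeholder container image
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///path/to/examples.jar    # placeholder jar path
  sparkVersion: "2.4.0"
  restartPolicy:
    type: Never
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark                               # account allowed to create executor pods
  executor:
    cores: 1
    instances: 2
    memory: 512m
EOF

The operator watches these resources, so resubmitting or monitoring a job becomes a kubectl operation rather than a hand-run spark-submit.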
