Java
web_dev, 2018-07-02 11:10:31

Apache Spark on Kubernetes - deploy process. How do you guys deploy?

Hello! In recent versions of Apache Spark and Kubernetes it became possible to deploy Spark applications with spark-submit directly against the Kubernetes master, so I have a question.
How are you deploying? What does the deployment process look like for you? Thoughts, ideas?
So far I see it the following way, for example:
Jenkins:
1. Builds and packages the Spark project.
2. Creates a Docker image with the Spark distribution inside and adds our application to it.
3. Pushes the image to the local image registry.
4. Runs kubectl to redeploy the Spark deployment; at this step a spark-submit is issued from the container to the Kubernetes master.
All steps are performed by Jenkins, which also runs in the cluster. There are many subtleties, of course, but in general it looks something like this.
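The spark-submit in step 4 could be sketched roughly as follows. This is a minimal illustration, not the poster's actual command: the master URL, image name, main class, and JAR path are placeholder assumptions.

```
# Sketch of a cluster-mode spark-submit against the Kubernetes API server.
# The k8s:// URL, image name, class, and JAR path are hypothetical placeholders.
spark-submit \
  --master k8s://https://kubernetes.default.svc:443 \
  --deploy-mode cluster \
  --name my-spark-app \
  --class com.example.Main \
  --conf spark.executor.instances=3 \
  --conf spark.kubernetes.container.image=registry.local/my-spark-app:latest \
  local:///opt/spark/jars/my-spark-app.jar
```

The `local://` scheme tells Spark the JAR is already inside the container image, which matches step 2 of the pipeline above.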
I would be grateful for any thoughts, comments, tips!


2 answer(s)
⚡ Kotobotov ⚡, 2018-07-11
@angrySCV

fabric8.io
and no worries about building pipelines

sergeysamsonov, 2020-05-20
@sergeysamsonov

Good afternoon! We used the Kubernetes Operator to manage Spark applications on K8s. Among the benefits: the application configuration is described in YAML, and you can deploy with Helm. In short, our pipeline looked like this:
1. Build the application.
2. Build and publish a Docker image with the Spark application JAR file.
3. Prepare the CRD SparkApplication specification.
4. Deploy the CRD SparkApplication using Helm.
You can read more about our experience in the article. Happy to answer questions; you can also write to me on Telegram @samsonov90.
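A SparkApplication manifest from step 3 might look something like the sketch below. The names, image, Spark version, and resource sizes are illustrative assumptions, not values from this answer; the `sparkoperator.k8s.io/v1beta2` API group is the one used by the spark-on-k8s-operator.

```
# Hypothetical SparkApplication manifest for the Kubernetes Operator for Spark;
# image, mainClass, and sizing values are placeholders.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-spark-app
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: registry.local/my-spark-app:latest
  mainClass: com.example.Main
  mainApplicationFile: local:///opt/spark/jars/my-spark-app.jar
  sparkVersion: "3.1.1"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark
  executor:
    instances: 2
    cores: 1
    memory: 512m
```

Because this is plain YAML, templating the image tag and executor counts per environment is exactly what the Helm deployment in step 4 buys you.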
