GitLab CI: how do I make a dependent job?
Consider the following case: there are two separate template files that handle integration testing, the first one without Kafka and the second one with it:
include:
  # PRODUCT
  - project: 'gitlabci/integration-test'
    ref: dev_v2
    file:
      - 'spark/.base_integration_test.yml'
      - 'spark/.base_integration_test_with_kafka.yml'

.base_integration_test:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test

.base_integration_test__with_kafka:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test_with_kafka
As discussed in the comments:
In rules, all variables are evaluated before the pipeline is created, and only the matching jobs end up in the final pipeline, so variables computed in other jobs cannot be used in rules (in the parent pipeline), as in the sketch below.
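For example (the job name and the USE_KAFKA variable here are hypothetical), a rule like the following is evaluated when the pipeline is created, so a value exported by another job's script is simply not visible yet:

integration_test_with_kafka:
  extends: .base_integration_test_with_kafka
  rules:
    # evaluated at pipeline creation time; a dotenv variable produced
    # by another job in the same pipeline is not available here
    - if: '$USE_KAFKA == "true"'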
For this you need triggers: the computed variable is then passed to the child pipeline, where it is substituted normally and only the necessary jobs remain.
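A minimal sketch of that approach, assuming a hypothetical USE_KAFKA variable and child file name integration-child.yml: a job in the parent computes the value and exposes it as a dotenv report, the trigger job picks it up through needs with artifacts and forwards it to the child pipeline, and the child pipeline's rules then see it.

# parent .gitlab-ci.yml (sketch)
detect-env:
  stage: build
  script:
    # compute the value at runtime and expose it via a dotenv report
    - echo "USE_KAFKA=true" >> build.env
  artifacts:
    reports:
      dotenv: build.env

integration:
  stage: test
  trigger:
    include: integration-child.yml
    strategy: depend
  needs:
    - job: detect-env
      artifacts: true

# integration-child.yml (sketch; the child pipeline needs its own include of the templates)
include:
  - project: 'gitlabci/integration-test'
    ref: dev_v2
    file:
      - 'spark/.base_integration_test.yml'
      - 'spark/.base_integration_test_with_kafka.yml'

integration_test:
  extends: .base_integration_test
  variables:
    COVERAGE_SOURCE: "./src"
  rules:
    - if: '$USE_KAFKA != "true"'

integration_test_with_kafka:
  extends: .base_integration_test_with_kafka
  variables:
    COVERAGE_SOURCE: "./src"
  rules:
    - if: '$USE_KAFKA == "true"'

In the child pipeline the rules are evaluated with USE_KAFKA already set, so only one of the two test jobs is created.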