Continuous Delivery

Nikolay Baranenko, 2022-01-15 10:19:00

GitLab CI: how to make a dependent job?

Consider the following case: there are two separate include files that handle integration testing, the first one without Kafka and the second one with it:

include:
# PRODUCT
  - project: 'gitlabci/integration-test'
    ref: dev_v2
    file: 
      - 'spark/.base_integration_test.yml'
      - 'spark/.base_integration_test_with_kafka.yml'


I would like to choose one or the other testing scenario during pipeline execution, in a preliminary stage before integration testing starts: either

integration_test:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test


or

integration_test_with_kafka:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test_with_kafka


The only difference between them is the extends target.
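For illustration, a first attempt might gate the two jobs with rules keyed off a CI/CD variable. A minimal sketch, assuming a hypothetical RUN_KAFKA_TESTS variable; note this only works when the variable's value is already known at pipeline creation time (set in the UI, a schedule, or the trigger API):

integration_test:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test
  rules:
    # RUN_KAFKA_TESTS is a hypothetical variable; rules are evaluated
    # when the pipeline is created, before any job runs.
    - if: '$RUN_KAFKA_TESTS != "true"'

integration_test_with_kafka:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test_with_kafka
  rules:
    - if: '$RUN_KAFKA_TESTS == "true"'
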

What's the best way to do it?


1 answer
ZIK1337, 2022-01-17
@drno-reg

As discussed in the comments: in rules, all variables are substituted before the pipeline is launched, and only the matching jobs remain in the final pipeline, so a variable computed in another job cannot be used in rules (in the parent pipeline).
For that you need triggers: the computed variable is then substituted normally in the child pipeline, and only the necessary jobs remain there.
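
A minimal sketch of that approach, assuming a hypothetical decide job, dotenv file decide.env, child pipeline file child-integration-test.yml, and USE_KAFKA variable. The parent .gitlab-ci.yml computes the choice at runtime and triggers the child pipeline:

stages:
  - prepare
  - test

decide:
  stage: prepare
  script:
    # Compute the choice at runtime (placeholder logic) and
    # export it via a dotenv artifact report.
    - echo "USE_KAFKA=true" >> decide.env
  artifacts:
    reports:
      dotenv: decide.env

integration_tests:
  stage: test
  trigger:
    include: child-integration-test.yml
    strategy: depend
  needs:
    # Inherit the dotenv variables so they are passed to the child pipeline.
    - job: decide
      artifacts: true

The child pipeline (child-integration-test.yml) receives USE_KAFKA already computed, so its rules can evaluate it and keep only the matching job:

include:
  - project: 'gitlabci/integration-test'
    ref: dev_v2
    file:
      - 'spark/.base_integration_test.yml'
      - 'spark/.base_integration_test_with_kafka.yml'

integration_test:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test
  rules:
    - if: '$USE_KAFKA != "true"'

integration_test_with_kafka:
  variables:
    COVERAGE_SOURCE: "./src"
  extends: .base_integration_test_with_kafka
  rules:
    - if: '$USE_KAFKA == "true"'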
