How to set Airflow DAG scheduling so a downstream task runs only once

I'm trying to schedule an Airflow DAG.
I set up the dependencies like the code below:
Task1 >> [Task2, Task3] >> Task4
I expected Task4 to run once, after Task2 and Task3 finished.
But it looks like Task4 ran twice:
(task1 -> task2 -> task4) and (task1 -> task3 -> task4)
I think this because that is how it appears in the Airflow DAG Tree View.
How can I make Task4 run only once?

The Tree View in the Airflow UI shows all distinct branches from root to leaf in the DAG. Based on the screenshot you provided, there are 2 branches:
print_date >> sleep >> print_date_1
print_date >> templated >> print_date_1
This does not mean that the print_date_1 task ran twice. To see the actual DAG, check out the Graph View (just to the right of the Tree View button). You should see that each task is present only once.
You may find this guide helpful to understand the Airflow UI.
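For reference, here is a minimal sketch of a DAG with this fan-out/fan-in shape (Airflow 2.x-style imports; the dag_id, task ids and bash commands are placeholders, not taken from your DAG):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="fan_out_fan_in_example",   # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    task1 = BashOperator(task_id="task1", bash_command="date")
    task2 = BashOperator(task_id="task2", bash_command="sleep 5")
    task3 = BashOperator(task_id="task3", bash_command="echo branch")
    task4 = BashOperator(task_id="task4", bash_command="date")

    # task4 is a single task; with the default trigger_rule="all_success"
    # it runs exactly once per DAG run, after BOTH task2 and task3 succeed.
    task1 >> [task2, task3] >> task4

In the Graph View this renders as one diamond; the Tree View simply draws one root-to-leaf path per branch, which is why task4 appears on two lines there.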

Related

Jenkins: orchestrate a set of jobs in CI/CD

The dependency is:
When Job A finishes it should start Job B and Job C. Then Job C starts Job D. Only when both Job D and Job B have finished should Job E start.
Please suggest how I can achieve this.
When Job A finishes it starts Job B and Job C: Job B and Job C can be triggered simply as a post-build step of A.
Then Job C starts Job D: same thing, D is triggered as a post-build step of C.
When Job D and Job B finish, only then start Job E: you can trigger E from D or B, or from both of them, as a post-build step, and then in E use the option "Block build if certain jobs are running" with D and B set as the blocking jobs. This way E will be triggered by D and B, but it will wait until neither of them is running before it starts its own task. There are other ways, but I think this one is the easiest.
Let me know if it helps...
You could use the Parameterized Trigger Plugin, which allows you to do just that. For example, after installing the plugin, in Job A you would have the possibility to add a build step that triggers another job (or jobs).
The Join Plugin along with the Build Pipeline Plugin :)

List View of Pipeline views in Jenkins

In Jenkins I can create a "List View" and sort all the jobs I want into that list. Can a similar list view be created where I can sort all my "Build Pipeline Views" into different categories?
My pipelines:
pipeline1 : Job A -> Job B -> Job C
pipeline2: Job D -> Job E -> Job F
pipeline3: Job G -> Job H -> Job I
pipeline4: Job J -> Job K -> Job L
My list views should group the pipelines like so:
ListView1: pipeline1, pipeline3
ListView2: pipeline2, pipeline4
Is there any plugin which can help me with this or any other alternative way to do this?
You can use the Jenkins Build Pipeline Plugin, which is the best fit for your scenario.
You can order these simply by configuring the views in Jenkins in the order you asked about in the question, since Jenkins executes according to the job configuration in the admin UI screen.

Sidekiq worker processes not updating database records properly

I have a Sidekiq worker that processes a certain series of tasks in batches. Once it completes a job, it updates a tracker table with the success/failure of the task. Each batch has a unique identifier that is passed to the worker script, and the worker process queries that table for this unique id and updates that particular row through an ActiveRecord query similar to:
cpr = MODEL.find(tracker_unique_id)
cpr.update_attributes(:attempted => cpr[:attempted] + 1, :success => cpr[:success] + 1)
What I have noticed is that the tracker only records one set of tasks running, even though I can see from the Sidekiq log and another results table that x tasks finished running.
Can anyone help me with this?
Your update_attributes call has a race condition: you cannot safely increment like that, because it reads the current value into Ruby and writes it back, so multiple workers will stomp on each other. You must do a proper UPDATE SQL statement so the increment happens atomically in the database:
update models set attempted = attempted + 1, success = success + 1 where tracker_unique_id = ?

Jenkins plugin for sequential check

I have the below scenario:
The parent job triggers Job B, Job B triggers Job C, and Job C triggers Job D, in sequence, irrespective of whether the child jobs (B, C, D) fail or succeed.
What I want to achieve is that Job E should be triggered only after Job B, Job C and Job D all succeed. If any of the child jobs (B, C, D) fails, then the final Job E should not be triggered.
How shall I go about this? Is there any plugin for it?
Select "Trigger only if build is stable".
You may also be interested in using the Workflow plugin (as in your jenkins-workflow tag, perhaps accidental) to orchestrate the whole system programmatically:
build 'B'
build 'C'
build 'D'
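// each build step fails the whole flow if the triggered job fails,
// so 'E' below runs only when B, C and D have all succeeded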
build 'E'

Hudson + Running parallel jobs

I would like to configure a project in Hudson as shown below.
The starting job is Job A. When this job is finished, it has to trigger three other jobs, B, C and D, together. These three jobs may take different times to complete. Once jobs B, C and D are finished, it has to trigger another job, E.
I have seen options like the Pipeline plugin, the Parameterized Trigger plugin, etc. These work fine for the first stage, i.e. they trigger builds B, C and D together when Job A is completed. But I am stuck at configuring Job E so that it starts only when all of the jobs B, C & D are finished.
Please assist. Thanks in advance.
Use the Join Plugin; it will allow you to start B, C, and D after A is finished, then trigger E when they are all successfully done.
Use a simple DSL script (e.g. with the Build Flow plugin).
Example:
parallel (
    { build("job1") },
    { build("job2") },
    { build("job3") }
)
build("job4")
Here the three jobs run in a parallel phase.
The fourth job gets executed only after the parallel jobs complete.

Resources