How can I delete a dataflow job using gcloud? - google-cloud-dataflow

I am attempting to delete a Google Cloud Dataflow job using gcloud, and it's failing for a reason I don't understand:
# gcloud dataflow --project=XXXX jobs cancel --region=europe-west2 bhp-dp-pubsub-to-datalake
Failed to cancel job [bhp-dp-pubsub-to-datalake]: (fe9655fb12e69cb6): Could not cancel
workflow; user does not have sufficient permissions on project: XXX, or the job does not
exist in the project. Please ensure you have permission to access the job and the
`--region` flag, europe-west2, matches the job's region.
I know that I have permission to cancel jobs because I can do it from the UI.
Does anyone have any idea what might be wrong here?
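For reference, a hedged sketch of the ID-based form of the same cancel, in case the job name vs. job ID distinction matters here; XXXX, europe-west2 and JOB_ID are placeholders:
# List the jobs in the region to pick up the job's ID
$ gcloud dataflow jobs list --project=XXXX --region=europe-west2 --status=active
# Cancel by the JOB_ID shown in the listing
$ gcloud dataflow jobs cancel JOB_ID --project=XXXX --region=europe-west2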

Related

Pass User-Scoped Credentials in Downstream Job in Jenkins is giving error

I am trying to pass user-scoped credentials to a downstream job in Jenkins in a declarative pipeline, so that the downstream job can use them for AWS authentication. I have checked the option "Run as User who triggered the build" in the Jenkins settings. When I trigger the job directly it works, but when I trigger it from another job it gives me an error like "Error: " followed by the credentials ID. This means the credentials are passed to the job, but for some reason they cannot be used.
I use the credentials like this: environment { creds = credentials("${AWSCredentials}") } in a stage of the declarative pipeline, and it fails right there. My goal is to make all the jobs run with each user's personalized credentials, and not to use global credentials to access and modify AWS resources through those jobs.

Establish relationship between two Jenkins Jobs available on different Jenkins server

I am building a Jenkins job for test/QA automation scripts; let's name it TEST_JOB. For the application, I have a Jenkins build of the application source code; name it DEV_JOB.
My scenario is: when DEV_JOB completes execution successfully, execute TEST_JOB immediately. I am aware of setting up an upstream/downstream project relationship [Build after other projects are built] to accomplish this task. But the problem here is that DEV_JOB is on a different server than TEST_JOB, so TEST_JOB fails to recognize DEV_JOB.
Now, how would I achieve this scenario?
You can use the Jenkins API to trigger a job remotely.
Say you have DEV_JOB on JENKINS_1; add a penultimate step (or an upstream/downstream project containing only this step) that invokes TEST_JOB using a remote API call to the JENKINS_2 server.
An example command would be:
curl --user "username:password" "http://JENKINS_2/job/TEST_JOB/buildWithParameters?SOMEPARAMETER=$SOMEPARAMETER"
username:password is a valid user on JENKINS_2.
Avoid using your own account here; use a dedicated 'build trigger' account that only has permission to start those jobs.
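As a variant sketch, the same trigger with an explicit POST and an API token for that dedicated account instead of a password (trigger-bot and API_TOKEN are placeholders, everything else is as in the example above):
# POST to buildWithParameters using the trigger account's API token rather than its password
curl -X POST --user "trigger-bot:API_TOKEN" "http://JENKINS_2/job/TEST_JOB/buildWithParameters?SOMEPARAMETER=$SOMEPARAMETER"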

Google Dataflow jobs stuck analysing the graph

We have submitted a couple of jobs that seem to be stuck at the graph analysis step.
A weird error appears on top of the Google Dataflow jobs list page:
A job with ID "2018-01-19_03_27_48-15138951989738918594" doesn't exist
Also, listing them using the gcloud tool shows them in an Unknown state:
JOB_ID NAME TYPE CREATION_TIME STATE REGION
2018-01-19_03_27_48-15138951989738918594 myjobname2 Streaming 2018-01-19 12:27:48 Unknown europe-west1
2018-01-19_03_21_05-1065108814086334743 myjobname Streaming 2018-01-19 12:21:06 Unknown europe-west1
Trying to cancel them using the gcloud tool doesn't work either:
$ gcloud beta dataflow jobs --project=myproject cancel 2018-01-19_03_21_05-1065108814086334743
Failed to cancel job [2018-01-19_03_21_05-1065108814086334743]: (9027838c1500ddff): Could not cancel workflow; user does not have sufficient permissions on project: myproject, or the job does not exist in the project.
Any idea?
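As a hedged sketch only (the --region flag on describe is an assumption about the installed gcloud version), describing one of the jobs directly by its ID is another way to see what state the service itself reports:
# Inspect the job by ID; the returned currentState field shows what the service thinks of it
$ gcloud dataflow jobs describe 2018-01-19_03_21_05-1065108814086334743 --project=myproject --region=europe-west1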

Force killing a Cloud Dataflow job

When trying to open the job's info from the GCP web UI, the page never opens; instead, this kind of message is presented in the main jobs listing:
A job with ID "2017-10-17_23_35_15-3310306527853724439" doesn't exist
gcloud dataflow jobs cancel returns this kind of message (repeatedly):
Failed to cancel job [2017-10-17_23_35_15-3310306527853724439]:
(882c3a8a1f6e0d10): Workflow modification failed. Causes:
(19752e1d053cad56): Operation cancel not allowed for job
2017-10-17_23_35_15-3310306527853724439. Job is not yet ready for canceling.
Please retry in a few minutes.
Updating the job or deploying a new job with the same name doesn't work either. So how do I force-kill the job?
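Since the message itself says to retry in a few minutes, one low-tech option is a retry loop around the same cancel command; this is only a sketch, it assumes gcloud exits non-zero when the cancel is rejected, and MY_PROJECT is a placeholder:
# Keep retrying the cancel every 5 minutes until gcloud reports success
while ! gcloud dataflow jobs cancel 2017-10-17_23_35_15-3310306527853724439 --project=MY_PROJECT; do
  sleep 300
done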

Jenkins CI - Allowing a job to run (triggered manually) only if a specific list of jobs are "blue" (successful)

Is there a way to achieve that, i.e. getting the status of other jobs and checking whether their last run was a success?
I don't want to run this deployment job automatically on upstream success; I want to trigger it manually, but still safeguard it by checking that the (multiple) upstream jobs succeeded.
Thanks for the help
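A rough sketch of how the safeguard itself could look, if you drive it from a shell step via the Jenkins JSON API; JENKINS_URL, the upstream job names and user:API_TOKEN are all placeholders, not anything from the question:
# Require every listed upstream job's last completed build to be SUCCESS before deploying
for job in UPSTREAM_JOB_1 UPSTREAM_JOB_2; do
  result=$(curl -s --user "user:API_TOKEN" "http://JENKINS_URL/job/$job/lastCompletedBuild/api/json?tree=result")
  echo "$result" | grep -q '"result":"SUCCESS"' || { echo "$job is not green, aborting"; exit 1; }
done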
