How to manually schedule a job from AWS Lambda? - ruby-on-rails

I want to replace my cron scheduler. Is there a way to schedule an ActiveJob from a Lambda and CloudWatch? I'm using the Que gem.

You can create a schedule for an AWS Lambda function to be triggered using AWS CloudWatch schedule expressions.
AWS documentation:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/ScheduledEvents.html#CronExpressions
To trigger your AWS Lambda function manually, you can do it via the AWS Console, the SDK, or the CLI, or expose it through API Gateway with some security in place.
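As an illustration of the SDK route, here is a minimal sketch in Python with boto3 that invokes the function on demand (the function name is a placeholder, not something from the question):

import boto3

# Invoke the Lambda function on demand; the CloudWatch schedule rule
# invokes the same function on its cron expression.
client = boto3.client("lambda")
response = client.invoke(
    FunctionName="enqueue-active-job",  # placeholder function name
    InvocationType="Event",             # asynchronous invocation
)
print(response["StatusCode"])           # 202 for an async invoke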

Related

Is there a way to update a Dataflow job using the gcloud command?

I am trying to write a script to automate the deployment of a Java Dataflow job. The script creates a template and then uses the command
gcloud dataflow jobs run my-job --gcs-location=gs://my_bucket/template
The issue is, I want to update the job if the job already exists and it's running. I can do the update if I run the job via maven, but I need to do this via gcloud so I can have a service account for deployment and another one for running the job. I tried different things (adding --parameters update to the command line), but I always get an error. Is there a way to update a Dataflow job exclusively via gcloud dataflow jobs run?
Referring to the official documentation, which describes gcloud beta dataflow jobs as a group of subcommands for working with Dataflow jobs, there is currently no way to update a job using gcloud.
For now, the Apache Beam SDKs provide a way to update an ongoing streaming job on the Dataflow managed service with new pipeline code; you can find more information here. Another way of updating an existing Dataflow job is via the REST API, for which a Java example is available.
Additionally, you can follow the feature request regarding recreating the job with gcloud.
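As a rough sketch of the REST-API route in Python rather than Java, assuming the job was started from a template and that the project, region, and job name below are placeholders, you can re-launch the template with the update flag set:

import google.auth
from googleapiclient.discovery import build

# Re-launch the template with update=True so Dataflow replaces the running
# streaming job of the same name instead of starting a second one.
credentials, _ = google.auth.default()
dataflow = build("dataflow", "v1b3", credentials=credentials)

request = dataflow.projects().locations().templates().launch(
    projectId="my-project",             # placeholder project
    location="us-central1",             # placeholder region
    gcsPath="gs://my_bucket/template",  # template path from the question
    body={
        "jobName": "my-job",
        "update": True,                 # update the running job
        "parameters": {},               # template parameters, if any
    },
)
print(request.execute())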

How to scale down OpenShift/Kubernetes pods automatically on a schedule?

I have a requirement to scale down OpenShift pods at the end of each business day automatically.
How might I schedule this automatically?
OpenShift, like Kubernetes, is an API-driven application. Essentially all application functionality is exposed over the control-plane API running on the master hosts.
You can use any orchestration tool that is capable of making API calls to perform this activity. Information on calling the OpenShift API directly can be found in the official documentation in the REST API Reference Overview section.
Many orchestration tools have plugins that allow you to interact with the OpenShift/Kubernetes API more natively than making network calls directly. In the case of Jenkins, for example, there is the OpenShift Pipeline Jenkins plugin, which allows you to perform OpenShift activities directly from Jenkins pipelines. In the case of Ansible, there is the k8s module.
If you combine this with Jenkins' ability to run jobs on a schedule, you have something that meets your requirement.
For something much simpler, you could just schedule Ansible or bash scripts on a server via cron to make the appropriate calls against the OpenShift API.
Executing these commands from within OpenShift would also be possible via the CronJob object.
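For illustration, a minimal Python sketch of the API-call approach, using the kubernetes client against a plain Deployment (the names are placeholders, and an OpenShift DeploymentConfig would go through the apps.openshift.io API instead), which cron or a CronJob could run at the end of the business day:

from kubernetes import client, config

# Scale the deployment down to zero replicas at the end of the business day.
config.load_kube_config()               # or load_incluster_config() inside a pod
apps = client.AppsV1Api()
apps.patch_namespaced_deployment_scale(
    name="my-app",                      # placeholder deployment name
    namespace="my-namespace",           # placeholder namespace
    body={"spec": {"replicas": 0}},
)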

Cloud Dataflow to trigger email upon failure

When running a Dataflow job in Google Cloud, how can the Dataflow pipeline be configured to send an email upon failure (or successful completion)? Is there an easy option where it can be configured inside the pipeline program, or any other options?
One possible option from my practice is to use Apache Airflow (on GCP, it's Cloud Composer).
Cloud Composer/Apache Airflow can send you failure emails when DAGs (jobs) fail, so I can host my job in Airflow and it sends emails upon failures.
You can check the quickstart [1] to see if it satisfies your need.
[1] https://cloud.google.com/composer/docs/quickstart
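As a small sketch of that approach, an Airflow 2.x-style DAG with email_on_failure set in its default_args will alert on any task failure (the address, DAG id, and the BashOperator wrapping a gcloud launch command are placeholders; Cloud Composer also needs its email/SendGrid configuration in place):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Any task failure in this DAG triggers a notification email.
default_args = {
    "email": ["team@example.com"],      # placeholder address
    "email_on_failure": True,
}

with DAG(
    dag_id="dataflow_job_with_alerts",  # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    run_dataflow = BashOperator(
        task_id="run_dataflow_job",
        # placeholder launch command; use your usual Dataflow launcher here
        bash_command="gcloud dataflow jobs run my-job --gcs-location=gs://my_bucket/template",
    )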

Trigger Jenkins job when a S3 file is updated

I'm looking for a way to trigger my Jenkins job whenever a file is created or updated in S3.
I can't seem to find anything by the usual means of searching. It is always about uploading artifacts to S3, rarely about downloading, and even then I can't seem to find a way to trigger off the actual update process.
The only way I can currently figure out how to do this at all would be to sync the file periodically and compare the hash to previous versions, but that is a really terrible solution.
The idea behind this would be to have an agency (which does not have access to our Jenkins) upload their build artifacts and to trigger a deployment from that.
You can use a combination of SNS notifications for new artifacts in the S3 bucket (https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html) and the Jenkins AWS SQS plugin (https://github.com/jenkinsci/aws-sqs-plugin) to trigger a build.
A little bit of manual configuration is required in terms of the AWS SQS plugin, but it should work.
S3 Upload > SNS Notification > Publish to SQS > Trigger Jenkins Build
Ideally it would be straight to Jenkins like so: S3 Upload > SNS Notification > Publish to Jenkins HTTP Endpoint > Trigger Jenkins Build
Hope this helps
You can write a cron job (on Linux) or a PowerShell script (on Windows) that queries the particular S3 bucket for the given string, and if it finds it, triggers the Jenkins job.
For this to work, the Jenkins instance must be running in AWS itself if you want to attach an IAM role; if not, you need to configure AWS credentials.
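A rough sketch of that polling approach in Python (bucket, key, file path, and the Jenkins job and tokens are all placeholders, and it assumes the Jenkins job has "Trigger builds remotely" enabled):

import boto3
import requests

# Compare the artifact's current ETag with the last one we saw; if it
# changed, trigger the Jenkins job through its remote-build token.
s3 = boto3.client("s3")
etag = s3.head_object(Bucket="agency-artifacts", Key="builds/app.zip")["ETag"]

try:
    last_etag = open("/var/tmp/last_etag").read()
except FileNotFoundError:
    last_etag = ""

if etag != last_etag:
    requests.post(
        "https://jenkins.example.com/job/deploy-artifact/build",
        params={"token": "REMOTE_TRIGGER_TOKEN"},  # "Trigger builds remotely" token
        auth=("ci-user", "JENKINS_API_TOKEN"),     # placeholder Jenkins credentials
    )
    with open("/var/tmp/last_etag", "w") as f:
        f.write(etag)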
To implement S3 Upload > Publish to SQS > Trigger Jenkins Build (assuming you have appropriate AWS Users, Roles and Policies attached):
Create an AWS SQS Queue
After creating the SQS queue, on the S3 bucket we need to configure:
The S3 bucket "Events" section to register an "Object Create" event
The SQS queue name as the destination (see the detailed documentation)
On Jenkins, we need to:
Install the AWS SQS plugin from the Jenkins plugin manager
Configure the AWS SQS plugin to point to the SQS queue in the Jenkins system configuration
Configure the Jenkins Job to "Trigger build when a message is published to an Amazon SQS queue"
Note that the Jenkins user MUST have read access to SQS (all read functions) in addition to S3 access.
Now whenever someone adds or updates anything in the bucket, S3 sends an event notification to the SQS queue, which is then polled by the Jenkins AWS SQS plugin, and the respective job build is triggered!
This article explains the process in detail (AWS to GitHub to Jenkins). If you are just using S3, you would skip the GitHub part.
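For the bucket-side step above, here is a hedged boto3 sketch of registering the "object created" notification that publishes to the SQS queue (bucket name and queue ARN are placeholders, and the queue's access policy must already allow s3.amazonaws.com to send messages):

import boto3

# Register an "object created" notification on the bucket that publishes
# straight to the SQS queue polled by the Jenkins AWS SQS plugin.
s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="agency-artifacts",          # placeholder bucket name
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:eu-west-1:123456789012:jenkins-trigger",  # placeholder ARN
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)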

How to execute a Spring Cloud Task using the REST API

I know a cloud task can be scheduled, and it can also be configured via a stream to be executed.
As a developer, I want to execute my Spring Cloud Task using the REST API so that I can run the task on demand.
Basically, I have a workflow management system and we are using a Control-M agent. Some of the jobs will be executed by Control-M, and some of the tasks will be deployed on the Spring Cloud Data Flow server. When one job completes, the other job, which is in the cloud, has to be executed.
So for this I need the capability to call a REST API and execute the cloud task on demand.
I am sure this feature must be there, but I am unable to find an example or documentation.
Can someone please help me.
Thanks in advance.
Please refer to the REST API guide; specifically, you'd be using the tasks/deployments endpoint to operate on an existing task.
Create:
dataflow:>task create foo --definition "timestamp"
Created new task 'foo'
Launch:
curl http://localhost:9393/tasks/deployments/foo\?arguments\=\&properties\= -d ""
P.S.: all the supported REST APIs are listed for your reference and are accessible at http://localhost:9393
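The same launch call from a script, shown here with Python's requests library purely as a sketch of the curl example above (server address and task name taken from that example):

import requests

# Launch the existing task definition "foo" via the Data Flow server's
# tasks/deployments endpoint, mirroring the curl command above.
resp = requests.post(
    "http://localhost:9393/tasks/deployments/foo",
    params={"arguments": "", "properties": ""},
)
resp.raise_for_status()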
