How to package and deploy a Flink SQL Table application from AWS Kinesis Data Analytics - devops

We have a Flink application using Flink SQL in AWS Kinesis Data Analytics.
Zeppelin provides a way to export a jar to S3 and then create an application from that jar in AWS.
However, how do we integrate this with CI/CD?
1. We need to put our code in Git. How do we export the code? As .zpln files, or as SQL files? .zpln files are not recognised by SonarQube.
2. How do we build those applications?
3. Once we have the jar, we can create an application using the aws create-application command, but building the jar and setting the environment properties correctly is what we want to know; see the sketch after this question.
Thanks.
We tried building the application using the AWS Zeppelin dashboard; it builds and runs, but we cannot integrate it with Git CI/CD.
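For reference, a minimal sketch of that create-application call, assuming the jar has already been uploaded to S3 (the names, ARNs, and Flink runtime version below are placeholders, not values from the question):

    # Create a Kinesis Data Analytics for Apache Flink application from a jar in S3;
    # environment properties are passed as property groups.
    aws kinesisanalyticsv2 create-application \
      --application-name my-flink-sql-app \
      --runtime-environment FLINK-1_13 \
      --service-execution-role arn:aws:iam::123456789012:role/my-kda-role \
      --application-configuration '{
        "ApplicationCodeConfiguration": {
          "CodeContent": {
            "S3ContentLocation": {
              "BucketARN": "arn:aws:s3:::my-artifacts-bucket",
              "FileKey": "flink/my-flink-sql-app.jar"
            }
          },
          "CodeContentType": "ZIPFILE"
        },
        "EnvironmentProperties": {
          "PropertyGroups": [
            {
              "PropertyGroupId": "FlinkApplicationProperties",
              "PropertyMap": {"input.stream.name": "my-input-stream"}
            }
          ]
        }
      }'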

Related

Is there a way to deploy Cloud Functions directly as ZIP artifacts to Google Cloud Platform, without relying on the default Cloud Build?

The default setup for Firebase Functions is to run firebase deploy, which will:
1. Upload the whole project to Cloud Build.
2. Cloud Build will extract the functions.
3. It will run npm install.
4. Create the ZIP artifacts.
5. Upload the ZIP artifacts to the cloud.
The question is whether you know of a way to create these ZIP artifacts on our side and upload them directly.
From my point of view, there are plenty of options for deploying one or more Cloud Functions.
The Deploying Cloud Functions documentation provides some initial context.
The easiest way, in my view, is to use the gcloud functions deploy command - see Cloud SDK CLI - gcloud functions deploy.
As a side note, the idea of using Cloud Build is not bad and has many benefits (security, organization of CI/CD, etc.), but it is up to you. Personally I use Cloud Build with Terraform, configured in such a way that only updated Cloud Functions are redeployed.
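For example, deploying straight from a local source directory can be as simple as the following sketch (the function name, runtime, and trigger are assumptions):

    # Let gcloud package and deploy the function in the current directory.
    gcloud functions deploy my-function \
      --runtime nodejs18 \
      --trigger-http \
      --source .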
There are different "levels" of build to take into consideration.
If it's only the first step, I mean creating a ZIP with the code of your function, no problem: you can do it on your side. Then you can deploy the ZIP through the console or with API calls.
In both cases, you need a ZIP and you deploy it. And if you use the gcloud functions deploy command, it does exactly the same thing: it creates a ZIP, sends it to storage, and deploys the function from that ZIP!
That was the first stage, where you manage the ZIP creation and send it to the cloud yourself.
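A minimal sketch of that first stage done on your side, assuming a hypothetical bucket and function name: build the ZIP yourself, upload it to Cloud Storage, and point the deployment at the uploaded archive instead of a local directory.

    # Package only the function sources into a ZIP.
    zip -r function.zip index.js package.json
    # Upload the artifact to a bucket you control.
    gsutil cp function.zip gs://my-artifacts-bucket/function.zip
    # Deploy from the ZIP in Cloud Storage rather than from local source.
    gcloud functions deploy my-function \
      --runtime nodejs18 \
      --trigger-http \
      --source gs://my-artifacts-bucket/function.zip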
HOWEVER, to run that ZIP of code on Google Cloud Platform, it has to be packaged into something runnable, because you only have a function, and a function on its own isn't runnable and can't handle HTTP requests.
Therefore, Google Cloud runs a Cloud Build under the hood and uses Buildpacks to package your code in a container (yes, everything is a container at Google Cloud), then deploys that container on the Cloud Functions platform.
You can't skip that container creation; without the container, your Cloud Function can't run. Alternatively, you can use Cloud Run, build your own container on your side, and deploy it without Cloud Build.
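A sketch of that Cloud Run alternative, where you build the container yourself and no Cloud Build is involved (the project, service, and region names are placeholders):

    # Build and push the container from your side.
    docker build -t gcr.io/my-project/my-service:latest .
    docker push gcr.io/my-project/my-service:latest
    # Deploy the prebuilt image to Cloud Run.
    gcloud run deploy my-service \
      --image gcr.io/my-project/my-service:latest \
      --region us-central1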

Google Cloud Build Python Apache Beam Dataflow YAML file

I am trying to deploy an Apache Beam Dataflow pipeline, built in Python, with Google Cloud Build. I can't find any specific details about constructing the cloudbuild.yaml file.
I found a link, dataflow-ci-cd-with-cloudbuild, but it seems Java-based; I tried it too, but it did not work, as my starting point is main.py.
It requires a container registry. The steps to build & deploy are explained in the link below.
GitHub link
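As a starting point, a minimal cloudbuild.yaml sketch for a Python pipeline whose entry point is main.py might look like the following (the bucket, region, and Python image are assumptions, not taken from the linked repository):

    # Write a one-step Cloud Build config that installs Beam and launches the pipeline.
    cat > cloudbuild.yaml <<'EOF'
    steps:
      - name: 'python:3.10'
        entrypoint: 'bash'
        args:
          - '-c'
          - |
            pip install 'apache-beam[gcp]'
            python main.py \
              --runner DataflowRunner \
              --project $PROJECT_ID \
              --region us-central1 \
              --temp_location gs://my-temp-bucket/tmp
    EOF
    # Submit the build; $PROJECT_ID is substituted by Cloud Build.
    gcloud builds submit --config cloudbuild.yaml .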

How to upload content to WildFly welcome-content via jboss-cli-client?

I am using a Jenkins builder and have a WildFly server. How do I upload content to the WildFly welcome-content folder via jboss-cli-client.jar, which I am using for uploading deployments?
I can enter the deploy command in a Jenkins batch step; now I would like to upload the welcome-content as well. I searched for a command but did not find any pointers as to what it could be.
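For context, the deploy step described here usually looks something like this sketch (the host, credentials, and artifact name are placeholders); the question does not show an equivalent jboss-cli command for welcome-content:

    # Deploy a war through the management interface using the CLI client jar.
    java -jar jboss-cli-client.jar --connect \
      --controller=wildfly-host:9990 \
      --user=admin --password=secret \
      --command="deploy target/myapp.war --force"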

How to change the data source of a deployment in Jenkins?

I am using Jenkins to automate the build and deployment of a Java web application. Currently I am able to automate the build and deployment of the application to Tomcat using Jenkins. But I also need to change the application's datasource.properties file in order to point to a specific schema. Is there any plugin to do that?
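One plugin-free approach is to rewrite the property in a shell build step before packaging, as in this sketch (the property key, value, and file path are assumptions):

    # Point the datasource at the desired schema before the war is built.
    sed -i 's/^db.schema=.*/db.schema=staging_schema/' \
      src/main/resources/datasource.properties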

Unity3D + Jenkins Configuration

I'm using Unity3D, Jenkins, and a Bitbucket private repository. Jenkins is installed on a cloud service. I want to automate builds and probably run some tests in Jenkins before pushing to the repository.
I have configured Jenkins properly, to the point that it works when changes are pushed to the repository, but this configuration does not yet cover the Unity3D Build Plugin. To get the Unity3D Build Plugin working properly, you must configure it by providing the directory of your Unity3D Editor, i.e. the location of its .exe file. The problem is that Jenkins is installed on a cloud service while the Unity3D Editor is on my computer. So my question is: is it possible to tell Jenkins, installed on a cloud service, that the Unity3D installation directory is located on my computer? If yes, how is it done?
Right now it's a limitation of the plugin. I hope to fix this. Open a bug in the issue tracker (https://issues.jenkins-ci.org/browse/JENKINS) and let's move the conversation there!
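In the meantime, a common workaround is to run a Jenkins agent on the machine that has the Unity Editor and call Unity in batch mode from a build step, roughly as in this sketch (the install path and build method name are placeholders):

    REM Run the project's build method headlessly on the agent where Unity is installed.
    "C:\Program Files\Unity\Editor\Unity.exe" -batchmode -quit ^
      -projectPath "%WORKSPACE%" ^
      -executeMethod BuildScript.PerformBuild ^
      -logFile build.log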
