Stored procedure not getting created using Azure Pipelines

When I try to deploy a stored procedure using Azure Pipelines, it's not getting created in the database.
I would like to add it to an existing database by cloning the repo in Visual Studio 2019.
Please help me with the steps.

Related

Jenkins Builds Failing - Failed to connect to repo

Let me begin by stating this entire process was set up by a former employee. I understand how to use Jenkins and set up new items, but that is about the extent of my knowledge. Everything has been working fine for years, but about a month ago all builds started failing.
When looking at the configuration for each job I see an error message.
Comparing the console output from successful builds to that of failed builds, I also notice some differences, though I do not know what they mean.
The same job that had produced a successful build then failed to build a few days later. I do think there were plugin updates or something similar done in between.
Can anyone help me solve this to get our development flow back up and working properly? When files are pushed from Bitbucket, a Jenkins build automatically kicks off and pulls the files onto our staging server. Since Jenkins is not working correctly, I have to manually FTP any new files to the staging server, which takes a lot of time.
It seems that you are missing the credentials for the GitHub repository.
Jenkins has extensive documentation on how you can add a credential secret:
https://www.jenkins.io/doc/book/using/using-credentials/
Here is a simple tutorial for it:
https://www.thegeekstuff.com/2016/10/jenkins-git-setup/
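
For reference, once such a credential exists, here is a minimal pipeline sketch of where it gets used; the repository URL and the credential id 'github-creds' are placeholders, not values from the failing job:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // clone over HTTPS using the username/password or token
                // credential stored in Jenkins under this id
                git url: 'https://github.com/your-org/your-repo.git',
                    credentialsId: 'github-creds',
                    branch: 'master'
            }
        }
    }
}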

CI/CD integration problem when using google-cloud-build with github push as trigger for Cloud Run

I am trying to set up a CI/CD pipeline using one of my public GitHub repositories as the source for a Cloud Run (fully managed) service using Cloud Build. I am using a Dockerfile in the root folder of the repository, with the source configuration parameter set to /Dockerfile when setting up the Cloud Build trigger (to continuously deploy new revisions from the source repository).
When I initialize the Cloud Run instance, I face an error.
Moreover, when I try to run my Cloud Build trigger manually, it shows another error.
I also tried editing the continuous deployment settings to automatically detect a Dockerfile/cloudbuild.yaml. After that, the build process succeeds but the revisions are not getting updated. I've also tried deploying a new revision and then triggering the Cloud Build trigger, but it still isn't able to pick up the latest build from Container Registry.
I am positive that my Dockerfile and application code work properly, since I've previously submitted the build to Container Registry using Google Cloud Shell and tested it manually after deploying it to Cloud Run.
I need help fixing this issue.
UPPERCASE letters in the image path aren't allowed. Change Toxicity-Detector to toxicity-detector.

Configure Jenkins CI build to use TFVC hosted in Azure DevOps

We recently migrated from an on-premise TFS server to Azure DevOps. Our team uses TFVC for source control, and I'm getting the following exception when Jenkins polls for new check-ins:
FATAL: This server requires federated authentication but no mechanism was available to handle it.
com.microsoft.tfs.core.exceptions.TFSFederatedAuthException: This server requires federated authentication but no mechanism was available to handle it.
Given the exception class name is TFSFederatedAuthException I suspect Azure is expecting some sort of OAuth integration, but Jenkins doesn't appear to support that for TFVC.
All I did was change the Collection URL for that Jenkins build to https://dev.azure.com/MyCompany. The Project path remains the same, and I verified this because I was able to re-map all of my TFVC branches in Visual Studio by just pointing to the different collection URL and keeping the same project path. [Screenshot of the Jenkins source control config omitted.]
This Jenkins server is internal with no public facing IP address or host name.
How can I allow Jenkins to poll a TFVC repository hosted in Azure DevOps in order to trigger a CI build in Jenkins?
Why not use Azure Pipelines? That's a much bigger migration effort at the moment, and I'm just trying to solve a short-term problem.
Using Azure Pipelines is my long-term goal, but first I need to figure out how our automated tests can use an Oracle database, because all data is deleted before each test is executed using Selenium.
Azure DevOps uses OAuth to communicate by default, so putting in your username and password won't work. Instead, the trick is to generate a Personal Access Token (I suspect the Code | Read + Write scope should do it) and pass that in.
For the username pass in ".", and for the password your generated Personal Access Token. Give the token a descriptive name so you know which one is about to expire once you get the email notification.
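
If you would rather script the credential than click through the UI, here is a rough sketch for the Jenkins script console, assuming the standard Credentials plugin is installed; the credential id and description are placeholders:

import com.cloudbees.plugins.credentials.CredentialsScope
import com.cloudbees.plugins.credentials.SystemCredentialsProvider
import com.cloudbees.plugins.credentials.domains.Domain
import com.cloudbees.plugins.credentials.impl.UsernamePasswordCredentialsImpl

// username is "." as described above; the password is the generated PAT
def creds = new UsernamePasswordCredentialsImpl(
    CredentialsScope.GLOBAL,
    'azure-devops-pat',                   // id to select in the job's credentials dropdown
    'Azure DevOps PAT for TFVC polling',  // description: a good place to note the expiry
    '.',                                  // username: Azure DevOps ignores it
    'YOUR_PERSONAL_ACCESS_TOKEN')         // paste the generated token here
SystemCredentialsProvider.instance.store.addCredentials(Domain.global(), creds)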

Azure storage plugin in Jenkins creates folder structure on Azure blob

I have a Jenkins freestyle job which builds and deploys an Angular project to Azure Blob Storage.
Everything worked fine, but when the job succeeded, it created the full folder structure on the blob,
because my Angular project build output is in a subfolder and I provide the full path to the files I need in my Azure blob.
[Screenshot: Jenkins post-build action]
and it gives me this directory structure in the Azure blob:
[Screenshot: Azure blob storage]
I need my Angular build files (assets, js, etc.) directly in the $web blob.
[Screenshot: actual requirement]
Looks like this is a bug in the Windows Azure Storage Jenkins plugin, which is maintained by the Visual Studio China Jenkins Team as mentioned here, so you may want to report a bug or file a feature request as instructed there.
As a workaround you may try the Azure CLI Jenkins plugin and accomplish your requirement by using the az storage blob upload-batch Azure CLI command in your Jenkins job, or you may try the AzCopy utility as well. If you use a Jenkins pipeline, you can do something like the snippet below. Overall it's not a good fit for a freestyle job, but a better approach via a pipeline job.
// step into the build output folder so the uploaded paths are relative to it
dir('AAA/BBB/CCC/DDD/EEE/') {
    azureUpload … filesPath: 'files' …
}
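
If you go the Azure CLI route instead, the same idea works from a pipeline with a plain shell step; a sketch assuming the CLI is installed and already authenticated on the agent, with the storage account name and source path as placeholders:

// upload everything under the build output folder; blob paths stay
// relative to --source instead of the full workspace path
sh '''
    az storage blob upload-batch \
        --account-name mystorageaccount \
        --destination '$web' \
        --source AAA/BBB/CCC/DDD/EEE
'''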
Hope this helps!!

Continuous Deployment Bluemix with existing Bitbucket repo

I'm experiencing an issue with the Bluemix DevOps continuous integration system when it comes to linking the project to an existing private Bitbucket repository.
I tried the steps presented in this link, and although I'm able to see the content of the Bitbucket folder, DevOps is still stuck at the initial commit and does not pick up the appropriate files during the build stage.
Can anyone provide any tips or suggestions?
Many thanks
Those instructions, although they mention "private", are only valid for a public repository; as you have noted, without something similar to the GitHub integration there is no way to set up the shared token you would need to be able to authenticate. That said, today you cannot really use a pull method to bring changes from Bitbucket into DevOps Services; you will need to use a push method from somewhere that can authenticate to both your private repo and the DevOps Services git repo to keep them in sync.
https://developer.ibm.com/answers/questions/197619/continuous-delivery-with-bitbucket-and-jazzhub.html
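
One way to automate that push method is a small Jenkins job that mirrors the Bitbucket repository into the DevOps Services git repository. A rough sketch with placeholder URLs and credential ids, assuming both remotes accept username/password (or token) authentication:

pipeline {
    agent any
    triggers { pollSCM('H/15 * * * *') }  // check Bitbucket for new commits periodically
    stages {
        stage('Sync') {
            steps {
                // pull the latest state of the private Bitbucket repo
                git url: 'https://bitbucket.org/your-team/your-repo.git',
                    credentialsId: 'bitbucket-creds'
                // push the same commits on to the DevOps Services git repo
                withCredentials([usernamePassword(credentialsId: 'devops-git-creds',
                        usernameVariable: 'GIT_USER', passwordVariable: 'GIT_PASS')]) {
                    sh 'git push "https://$GIT_USER:$GIT_PASS@hub.jazz.net/git/your-org/your-project" HEAD:master'
                }
            }
        }
    }
}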
