Deploy an Azure Function using azurerm_function_app_function - terraform-provider-azure

I have created a new C# Azure Function (EventHub trigger) and added my own logic to handle the events that come in. My issue is that I am not sure how to deploy this new app via Terraform. I have created an azurerm_function_app_function resource, and from reading the Terraform docs I can see that I can add a file (https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/function_app_function#file). However, for a C# project that isn't using csx, I don't see how this can be used.
Is there a way to deploy a C# application using Terraform, or should I split these components up and have Terraform create the azurerm_function_app_function, and then have a pipeline that uploads my code to the function?
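One workable pattern is the split you describe: let Terraform create the function app itself and push the compiled C# package separately, either from a pipeline or by pointing Terraform at a zip through zip_deploy_file on azurerm_linux_function_app / azurerm_windows_function_app. The file block on azurerm_function_app_function is geared towards script-based functions such as csx, so a compiled project usually goes in as a package. A rough sketch of the zip approach on Linux, where the resource names and the publish.zip path are illustrative and the referenced resource group, service plan and storage account are assumed to be defined elsewhere in the configuration:

resource "azurerm_linux_function_app" "example" {
  name                       = "example-func" # illustrative
  resource_group_name        = azurerm_resource_group.example.name
  location                   = azurerm_resource_group.example.location
  service_plan_id            = azurerm_service_plan.example.id
  storage_account_name       = azurerm_storage_account.example.name
  storage_account_access_key = azurerm_storage_account.example.primary_access_key

  site_config {
    application_stack {
      dotnet_version              = "6.0"
      use_dotnet_isolated_runtime = true
    }
  }

  # Ship the zipped output of `dotnet publish` instead of per-function files;
  # run-from-package needs the app setting below.
  zip_deploy_file = "${path.module}/publish.zip"

  app_settings = {
    WEBSITE_RUN_FROM_PACKAGE = "1"
  }
}

With this, the whole compiled app is deployed in one go and the functions inside the package are picked up by the runtime, so no per-function azurerm_function_app_function resource is needed.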

Related

Synapse Devops Deployment failing when using user managed identity configured for different environments

We are facing a challenge in handling the user-assigned managed identity (UAMI) with the CI/CD pipelines in Azure DevOps when the UAMIs are not the same across environments.
Dev, Test and Prod have different subscriptions, so each Synapse implementation can have its own Azure subscription and possibly a different UAMI configured.
When we publish the templates from Synapse, it generates a separate credential.json file, which is not part of the parameterization template.
The question is: how can we override the parameters when there is a separate credential file?
We tried updating the linked service with the resource ID containing the UAMI details so that it becomes part of the parameter template file, but it is not working.

Programmatically Create Runner for BitBucket Pipelines

Is it possible to programmatically create and register a runner in Bitbucket Pipelines, in other words without having to create it first via the Bitbucket UI?
The docker command provided requires a runner UUID, which is only generated when the runner is created through the UI. Is there a way to create it programmatically through the Bitbucket API? It seems a bit backward to have to create the runner first just to get the UUID so you can then deploy it.
With GitHub Actions self-hosted runners, a runner can be created and registered to GitHub using a temporary token, but it does not seem like Bitbucket has adopted this approach, at least not yet.
At the time of writing the Bitbucket API does not allow for this. There are two open feature requests for Bitbucket Runner APIs, BCLOUD-21708 and BCLOUD-21309, that may benefit from some votes.

How do you connect Service Now with Forge to create a new BIM 360 project?

I've used the process documented here https://github.com/Autodesk-Forge/forge-bim360.project.setup.tool to automate creating a new project hub. We ask our users to fill out a form in ServiceNow for each new project. My question is: has anyone had any success linking the ServiceNow API to Forge, so that the values the end user fills in on the ServiceNow form are used by Forge to create a new project hub? Thanks.
I am not familiar with ServiceNow, but I have built an integration with an internal system.
The integration depends on what you need:
1. You can create a plugin for ServiceNow (if it supports adding plugins) and build an app that fills in the info you need to create a new project in BIM 360 (project name, users, etc.).
2. If you need to create a new project based on info already in ServiceNow, use the ServiceNow API to read everything you need and then pass it to the Autodesk Forge API to create the new project (see the sketch below).
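A very rough Python sketch of that second option, assuming the form writes into a custom ServiceNow table and that a 2-legged Forge token and the BIM 360 account ID are obtained elsewhere; the instance URL, table name, u_* field names and the query are placeholders, and the required project fields should be checked against the BIM 360 Account Admin API documentation:

import requests

SN_INSTANCE = "https://yourinstance.service-now.com"  # placeholder instance
SN_TABLE = "u_new_project_request"                    # hypothetical form table

def get_project_requests(sn_user, sn_password):
    """Read submitted form records via the ServiceNow Table API."""
    resp = requests.get(
        f"{SN_INSTANCE}/api/now/table/{SN_TABLE}",
        auth=(sn_user, sn_password),
        headers={"Accept": "application/json"},
        params={"sysparm_query": "u_state=submitted"},  # placeholder filter
    )
    resp.raise_for_status()
    return resp.json()["result"]

def create_bim360_project(forge_token, account_id, record):
    """Create a BIM 360 project through the Account Admin (HQ) API."""
    resp = requests.post(
        f"https://developer.api.autodesk.com/hq/v1/accounts/{account_id}/projects",
        headers={"Authorization": f"Bearer {forge_token}"},
        json={
            # The u_* names are whatever your ServiceNow form actually captures.
            "name": record["u_project_name"],
            "project_type": record.get("u_project_type", "Office"),
            "value": 0,
            "currency": "USD",
        },
    )
    resp.raise_for_status()
    return resp.json()

for record in get_project_requests("svc_user", "svc_password"):
    create_bim360_project("FORGE_ACCESS_TOKEN", "BIM360_ACCOUNT_ID", record)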

Is there a way to track usage of a global shared library in Jenkins?

Context:
At my work most developers are free to write their own Jenkinsfile for their own team's projects.
As the Jenkins admin, I provide developers with a global shared library.
Most projects use v1, v2, v3 or some other version of this library, via the idiom library("theSharedLib@v#").
Question: Is there a way for me to find out which Jenkinsfile is using which version of the shared library, without having to actually look into all those Jenkinsfiles (50+ files in as many Git repos)?
What I would like best is some mechanism that records (in a file on the Jenkins master or in a DB) which project/Jenkinsfile is using which version at the time the library is loaded.
A possible solution would be to add some code to every function inside the library that does this reporting; I could then see which function is used by whom. Is there a better solution?
I wrote https://github.com/CiscoDevNet/es-logger to gather information such as this from Jenkins. It has a plugin that runs a regex against the console log of a completed job and can then post events to Elasticsearch.
Jenkins helpfully logs library loads at the start of the console log, for example:
Loading library sharedLib@version
So a simple regex like
"^Loading library\s+(?P<library_name>.+?)@(?P<library_version>.+?)\s*$"
added to the console_log_events plugin would generate an event in Elasticsearch for each usage and each version.
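If you want to sanity-check that pattern outside es-logger, the sample log line and output below are just the example above run through plain Python:

import re

# Matches Jenkins' "Loading library <name>@<version>" console line, using the
# same named groups the es-logger console_log_events plugin would expose.
LIBRARY_RE = re.compile(
    r"^Loading library\s+(?P<library_name>.+?)@(?P<library_version>.+?)\s*$",
    re.MULTILINE,
)

sample_console_log = "Loading library sharedLib@v2\n[Pipeline] Start of Pipeline\n"

for match in LIBRARY_RE.finditer(sample_console_log):
    print(match.group("library_name"), match.group("library_version"))
    # prints: sharedLib v2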

Performing denodo tasks from Jenkins

I am trying to create a working prototype for performing Denodo activities from my Jenkins server.
The steps that I want to perform are:
Import a VSQL file from Git to Denodo from Jenkins.
Create a view in Denodo from Jenkins.
Run this VSQL file in Denodo from Jenkins.
I am new to the Denodo world and I am not sure if Denodo has any APIs for doing this.
Can someone let me know if this is really possible? If so, where can I find a solution for this requirement? I have been searching the internet for the last few days, but couldn't find a solution.
The reason you don't find much on the web for this is that the file format and query language in Denodo is called VQL, not VSQL. Try searching for that and you will find a lot.
Anyway, about your problem:
You have two options for CI/CD with Denodo. If you use Jenkins and just want to create views based on actions in other systems (e.g. create a base view as soon as a new table is created in the source), you can simply send the VQL create script (containing CREATE WRAPPER and CREATE VIEW) to the server via JDBC or ODBC. For that, create a technical user in Denodo and load the driver onto the Jenkins server.
The other option, if you are using Denodo 7, is to use the Solution Manager. It exposes a REST API with which you can create revisions, test them in different environments and deploy them. I am not sure whether you can create a revision from VQL code coming out of Jenkins, but I think this should be possible.
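To illustrate the first option, here is a minimal Python sketch of what the Jenkins job could run, assuming an ODBC DSN named denodo_vdb has already been set up on the Jenkins agent with the Denodo ODBC driver and the technical user's credentials (the DSN name and the VQL file path are placeholders):

import pyodbc

# Connect through a pre-configured ODBC DSN pointing at the Denodo server.
conn = pyodbc.connect("DSN=denodo_vdb", autocommit=True)
cursor = conn.cursor()

# Read the VQL script that the Jenkins job checked out from Git.
with open("vql/create_base_view.vql", encoding="utf-8") as f:
    vql = f.read()

# Naively split on ';' and run the statements one by one; adjust the
# splitting if your scripts contain semicolons inside string literals.
for statement in (s.strip() for s in vql.split(";")):
    if statement:
        cursor.execute(statement)

cursor.close()
conn.close()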
