dynamic SFTP from pipeline component - adapter

I am trying to create a dynamic SFTP send from a pipeline component for BizTalk 2016. I need to know the namespace that is used in the context properties for creating the SFTP connection. I will be configuring the basic port, server, username, password, destination, and connection limit outside of the code and outside of BizTalk. That is why I need to know the namespace that the SFTP location would be using inside the code.

Just look under the BizTalk.System application and you will find a property schema called SFTP.bts_sftp_properties, which has the namespace http://schemas.microsoft.com/BizTalk/2012/Adapter/sftp-properties
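If it helps, here is a minimal sketch of how a custom pipeline component could write those context properties for a dynamic send port. The server, credentials, and the exact adapter property names (UserName, Password, ConnectionLimit) are illustrative assumptions; verify the names against the SFTP.bts_sftp_properties schema mentioned above. Only the Execute method is shown, the rest of the pipeline component plumbing is omitted.

using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class DynamicSftpComponent : IComponent
{
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Namespace of the SFTP adapter property schema (SFTP.bts_sftp_properties).
        const string sftpNs = "http://schemas.microsoft.com/BizTalk/2012/Adapter/sftp-properties";
        // Namespace of the BizTalk system properties (outbound transport location/type).
        const string systemNs = "http://schemas.microsoft.com/BizTalk/2003/system-properties";

        // Placeholder values; in practice these come from your external configuration.
        string server = "sftp.example.com";
        string port = "22";
        string folder = "/inbound";
        string user = "svc-biztalk";
        string password = "secret";

        // For a dynamic send port, the destination is expressed as the outbound transport URI.
        pInMsg.Context.Write("OutboundTransportLocation", systemNs,
            "sftp://" + server + ":" + port + folder + "/%SourceFileName%");
        pInMsg.Context.Write("OutboundTransportType", systemNs, "SFTP");

        // Adapter-specific settings are written with the sftp-properties namespace.
        // Property names are assumptions; check them in the deployed property schema.
        pInMsg.Context.Write("UserName", sftpNs, user);
        pInMsg.Context.Write("Password", sftpNs, password);
        pInMsg.Context.Write("ConnectionLimit", sftpNs, 10);

        return pInMsg;
    }
}

The message then has to be routed to a dynamic send port for these adapter settings to take effect.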

Related

Storage Event Trigger failed to trigger the function

I am working on a pipeline that copies data from ADLS Gen2 into an Azure Synapse Dedicated SQL Pool. I used Synapse pipelines and followed the Microsoft Docs on how to create a storage event trigger. But when a new file is loaded into ADLS, I get the following error:
"The template language expression 'trigger().outputs.body.ContainerName' cannot be evaluated because property 'ContainerName' doesn't exist, available properties are 'RunToken'."
I have set the required pipeline parameters.
The pipeline runs successfully when I trigger it manually and pass the parameters. I would appreciate any solution or guidance to resolve this issue. Thank you very much.
I tried setting up the trigger so that the Synapse pipeline copies the new blob into the dedicated pool, but when I monitored the trigger runs, it failed, even though I can trigger the pipeline manually.
According to the storage event trigger documentation:
The storage event trigger captures the folder path and file name of the blob into the properties @triggerBody().folderPath and @triggerBody().fileName.
It does not have a property called container name.
As per the data you provided, it seems the file is stored at the root of your container. In that case, give the container name parameter the value @trigger().outputs.body.folderPath; because the blob sits directly in the container, the folder path it returns is the container name itself.
Now pass these pipeline parameters to the dataset properties dynamically.
The pipeline will then run successfully and copy the data from ADLS to the Synapse dedicated SQL pool.
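For example, the trigger's pipeline-parameter assignments would look roughly like this (the parameter names ContainerName and FileName come from the question; adjust them to your own pipeline):

ContainerName = @trigger().outputs.body.folderPath
FileName      = @trigger().outputs.body.fileName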

Azure DevOps secure file guids

In my ADO build pipeline, I have a secure file download step. When we branch versions, we use PowerShell to do the heavy lifting with cloning build definitions and updating settings/info in the cloned pipeline.
One issue I've run into is that the Secure File Download step doesn't accept variables, and in the UI you can only select names of files that already exist, so we've had to manually update it after every new branch we create.
I've grabbed the definition task step in PowerShell (as $step) and was hoping I could set $step.inputs.fileInputs to a variable I assign to something like cert-$newVersion; however, it is currently set to a GUID.
Does anyone know if it is possible to get the GUID of secure files in ADO via the API, or have a solution?
Yes, this API exists.
You could try the following REST API:
GET https://dev.azure.com/{OrganizationName}/{ProjectName}/_apis/distributedtask/securefiles?api-version=6.1-preview.1
The response lists each secure file with its name and id, so you can look up the secure file GUID based on the file name.
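As an illustration, here is a minimal sketch of scripting that lookup (written in C# with HttpClient; the same GET also works from PowerShell via Invoke-RestMethod). The organization, project, personal access token, and file name are placeholders:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class SecureFileLookup
{
    static async Task Main()
    {
        // Placeholders: fill in your organization, project, and a PAT with access to the project.
        string org = "MyOrg", project = "MyProject", pat = "<personal-access-token>";
        string name = "cert-1.2.3"; // e.g. the value you build as cert-$newVersion
        string url = "https://dev.azure.com/" + org + "/" + project +
                     "/_apis/distributedtask/securefiles?api-version=6.1-preview.1";

        using var client = new HttpClient();
        // PAT authentication: basic auth with an empty user name and the PAT as the password.
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));

        string json = await client.GetStringAsync(url);

        // The response is shaped like { "count": n, "value": [ { "id": "<guid>", "name": "..." }, ... ] }.
        using var doc = JsonDocument.Parse(json);
        foreach (var file in doc.RootElement.GetProperty("value").EnumerateArray())
        {
            if (file.GetProperty("name").GetString() == name)
                Console.WriteLine(file.GetProperty("id").GetString()); // the secure file GUID
        }
    }
}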

Azure Data factory parameters

I have a Data Factory pipeline which takes the following parameters:
Param1
Database Server 1 Name
Database Server 2 Name
Database Server 1 Username
Database Server 2 Username
etc
My pipeline decides via some logic which database server to do an import from.
Essentially I want to deploy two versions of my pipeline: one runs in dev and the other in prod.
I want to release a dev and prod version of my pipeline via Azure Devops. Each environment release should provide (via key vault) the values of:
Database Server 1 Name
Database Server 2 Name
Database Server 1 Username
Database Server 2 Username
First prize would be if those values did not even show up as pipeline parameters any more, so that triggers would only have to provide Param1. Likewise, when running the pipeline manually, I want to provide only Param1.
EDIT: Note that I eventually use the parameters in a parameterized linked service, if that makes a difference (https://learn.microsoft.com/en-us/azure/data-factory/parameterize-linked-services).
I think the key idea to resolve your problem is to use two separate Data Factory instances.
In the DEV environment you have your parameterized connection as you stated above. When taking the code to PROD, you export the template and import it into the other instance. There you have an additional config file that supplies the values needed to set up the connection properly.
If you want to avoid having the credentials stored in the config file, add an Azure Key Vault linked service and parameterize the secret identifier accordingly. When you import the template into PROD, the only parameter you need to provide is the name of the secret to grab from Key Vault (a rough sketch of such a linked service follows the links below).
See here for more info:
devops integration
key vault integration
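For illustration, a parameterized linked service that pulls its connection string from Key Vault might look roughly like this (the names SqlDatabaseLS and AzureKeyVault1 and the secretName parameter are placeholders; see the linked docs for the exact schema):

{
    "name": "SqlDatabaseLS",
    "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
            "secretName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "AzureKeyVault1",
                    "type": "LinkedServiceReference"
                },
                "secretName": "@{linkedService().secretName}"
            }
        }
    }
}

With this shape, the deployment only has to supply the secret name per environment; the actual credential stays in Key Vault.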

Is it possible to add the ELB dynamically to Spinnaker

Is there a way to add the load balancer name dynamically in Spinnaker? For example, I have a pipeline which bakes the AMI and deploys it into the ASG for different environments. In the Deploy phase's cluster definition, I don't want to hardcode the ELB name, and would instead like to read it from a properties file.
This way I can use the same pipeline for different environments.
Any thoughts or ideas?
Thanks in advance.
Perform the 'add load balancer' action via Deck while having the browser's inspection panel open on the network tab, and capture the API call. You'll notice there's a JSON payload; you can modify and subsequently script that action.
Edit the pipeline config, and under the Deploy stage edit the appropriate "Deploy Configuration". For the load balancer, use a Spring Expression like:
${ trigger.properties.LB_NAME }
This assumes you have pre-provisioned a load balancer whose name matches the LB_NAME value passed in the trigger's properties file.
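For reference, the properties file supplied by the trigger could be as small as this (the value is an illustrative placeholder; the key just has to match the expression above):

LB_NAME=myapp-dev-frontend-elb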

Custom url plugin for Jira

Is it possible to make a plugin for Jira that behaves as a custom URL?
For example, suppose I have Jira at http://jira.example.com and I want to get some data from e.g. http://jira.example.com/record/{id}, where id is a parameter for the plugin and the output data is an audio stream.
You can create a JIRA plugin with a REST module to serve arbitrary content at a URL similar to the following:
http://jira.example.com/rest/record/{id}.
If you prefer, you could write it as a straight servlet module instead, with a URL such as this:
http://jira.example.com/plugins/servlet/record/{id}
If you want to expose an endpoint at the main http://jira.example.com/record level, I am not aware of any way to do that within a plugin. (It should be possible, albeit not very portable, by editing the configuration files in the JIRA program directory.)
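As a rough sketch, the REST and servlet approaches above would be declared in the plugin's atlassian-plugin.xml roughly as follows (the module keys, the servlet class, and the paths are placeholders; note that the REST module's actual URL also contains the module version, e.g. /rest/record/1.0/{id}):

<!-- REST module: exposes JAX-RS resources under /rest/record/1.0/... -->
<rest key="record-rest" path="/record" version="1.0">
    <description>Streams audio for a record id.</description>
</rest>

<!-- Servlet module: exposes a servlet under /plugins/servlet/record/... -->
<servlet key="record-servlet" class="com.example.jira.RecordServlet" name="Record Servlet">
    <url-pattern>/record/*</url-pattern>
</servlet>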
