How to add a dynamic container in a storage account [duplicate]

Using Logic Apps, I am trying to copy blobs from one container into several separate, dynamically created containers; however, there doesn't appear to be a "create container" action in Logic Apps.
I have tried using the "Create Blob" action with the desired container name specified as part of the "Blob Name" parameter, but this fails with a 404 message.
{
    "status": 404,
    "message": "Specified container telemetery-30dfb0bd-73b0-42a3-8677-63bde2fd4b43 does not exist.\r\nclientRequestId: blahblahh-e60e-44e1-aec4-c32a21659257",
    "error": {
        "message": "Specified container telemetery-30dfb0bd-73b0-42a3-8677-63bde2fd4b43 does not exist."
    },
    "source": "blahblha-ne.azconn-ne-01.p.azurewebsites.net"
}
The original request is -
{
    "method": "post",
    "queries": {
        "folderPath": "/",
        "name": "/telemetery-30dfb0bd-73b0-42a3-8677-63bde2fd4b43/timeline,xml",
        "queryParametersSingleEncoded": "True"
    },
    "path": "/datasets/default/files",
    "host": {
        "connection": {
            "name": "/subscriptions/blahblah-6866-4c8c-b3f1-41039ad2b3eb/resourceGroups/RG-blahblahg/providers/Microsoft.Web/connections/azureblob"
        }
    },
    "body": "file content"
}
Is there a way to create a blob container using Logic Apps?

According to the documentation, there's no "create container" operation:
https://learn.microsoft.com/en-us/connectors/azureblobconnector/
What you can do is write an Azure Function and chain it as part of your workflow in order to create the container:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet#create-a-container
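If you go the Azure Function route, the function body only needs to create the container when it does not exist yet. Below is a minimal sketch using the azure-storage-blob Python SDK (the linked quickstart shows the .NET equivalent); STORAGE_CONNECTION_STRING is a hypothetical app setting holding the target storage account's connection string.
# Sketch of the container-creation step an Azure Function could perform.
# STORAGE_CONNECTION_STRING is a placeholder app setting, not something Azure defines for you.
import os
from azure.storage.blob import BlobServiceClient
from azure.core.exceptions import ResourceExistsError

def ensure_container(container_name: str) -> None:
    service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])
    try:
        service.create_container(container_name)  # create the container if it is missing
    except ResourceExistsError:
        pass  # container already exists, nothing to do

ensure_container("telemetery-30dfb0bd-73b0-42a3-8677-63bde2fd4b43")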

For now there is no action to create a blob container, so you could implement it with an Azure Function as Thiago proposed. Alternatively, you could use the REST API to create the container. The test below uses a SAS token for authorization; you could use another authorization method instead.
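As a rough illustration of that REST call (in a Logic App you would typically issue it from an HTTP action rather than from code), the Create Container operation is a PUT against the container URL with restype=container. The account name, container name and SAS token below are placeholders; the SAS is assumed to be an account SAS with create permission.
# Sketch of the Blob service "Create Container" REST call, authorized with a SAS token.
import requests

account_name = "mystorageaccount"
container_name = "telemetery-30dfb0bd-73b0-42a3-8677-63bde2fd4b43"
sas_token = "sv=...&ss=b&srt=sco&sp=rwdlac&sig=..."  # placeholder account SAS

url = f"https://{account_name}.blob.core.windows.net/{container_name}?restype=container&{sas_token}"
response = requests.put(url)
print(response.status_code)  # 201 Created on success, 409 Conflict if the container already exists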

Related

Why is the Azure Devops API telling me that I must have these parameters for my Push request when I already do?

I'm trying to make a Git push request to our Azure Devops server via the API. The address is https://MYSITE.visualstudio.com/MYPROJECT/_apis/git/repositories/2b34d4f7-2c1f-42e7-8861-u0ba34f72b40/pushes?api-version=5.1 and the body is as follows:
{
    "commits": [
        {
            "comment": "Just a dummy commit",
            "changes": [
                {
                    "changeType": "edit",
                    "item": {
                        "path": "/src/MYPROJECT/MYPROJECT.csproj"
                    },
                    "newContent": {
                        "content": "beans",
                        "contentType": "rawText"
                    }
                }
            ]
        }
    ],
    "refUpdates": [
        {
            "name": "refs/heads/TestDummyPRs/upgradeProjectToLatest",
            "oldObjectId": "058da4f3328cb1048cb43faf3b5158bc3b025615"
        }
    ]
}
I'm getting the following error:
Web Request Failed after 4 attempts. Request: https://MYSITE.visualstudio.com/MYPROJECT/_apis/git/repositories/2b34d4f7-2c1f-42e7-8861-u0ba34f72b40/pushes?api-version=5.1. Status: BadRequest. Response: Invalid status code [BadRequest]. Response: {"$id":"1","innerException":null,"message":"The parameters are incorrect. A posted push must contain exactly one commit and one refUpdate.\r\nParameter name: newPush","typeName":"Microsoft.TeamFoundation.SourceControl.WebServer.InvalidArgumentValueException, Microsoft.TeamFoundation.SourceControl.WebServer","typeKey":"InvalidArgumentValueException","errorCode":0,"eventId":0}
"A posted push must contain exactly one commit and one refUpdate" doesn't seem entirely reasonable as that's exactly what I have in my body. Does anybody know what might be going on here?
Note that I am having no issues making other web requests, such as creating branches or retrieving file contents.
I expected my web request to proceed smoothly, and to create a Push containing the specified commit to the specified refUpdate.
I made a manual push via the Azure DevOps web interface and captured the network traffic, and I grabbed the following JSON request out of it:
{
    "commits": [
        {
            "changes": [
                {
                    "changeType": 2,
                    "item": {
                        "path": "/src/MYPROJECT/MYPROJECT.csproj"
                    },
                    "newContent": {
                        "content": "beans",
                        "contentType": 0
                    }
                }
            ],
            "comment": "Just a dummy commit"
        }
    ],
    "refUpdates": [
        {
            "name": "refs/heads/TestDummyPRs/upgradeProjectToLatest",
            "oldObjectId": "058da4f3328cb1048cb43faf3b5158bc3b025615"
        }
    ]
}
This seems to be meaningfully identical to the Push I'm making from my code, other than the enum fields using numerical values instead of text. I have tried my code with numerical values for enums, but that didn't change anything about the error.
I found the issue. The web request from my application was being sent with UTF-16 encoding, whereas Postman had defaulted to UTF-8 encoding. I changed my application to use UTF-8 and it worked.
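For anyone hitting the same thing, the gist of the fix is simply to make sure the JSON body goes out as UTF-8. Here is a rough Python equivalent of the working request; the URL is the one from the question, and a personal access token is assumed for authentication.
# Sketch of the push request with the body explicitly encoded as UTF-8.
# The personal access token is a placeholder.
import json
import requests

url = ("https://MYSITE.visualstudio.com/MYPROJECT/_apis/git/repositories/"
       "2b34d4f7-2c1f-42e7-8861-u0ba34f72b40/pushes?api-version=5.1")
push = {
    "refUpdates": [{"name": "refs/heads/TestDummyPRs/upgradeProjectToLatest",
                    "oldObjectId": "058da4f3328cb1048cb43faf3b5158bc3b025615"}],
    "commits": [{"comment": "Just a dummy commit",
                 "changes": [{"changeType": "edit",
                              "item": {"path": "/src/MYPROJECT/MYPROJECT.csproj"},
                              "newContent": {"content": "beans", "contentType": "rawText"}}]}],
}
body = json.dumps(push).encode("utf-8")  # UTF-8, not UTF-16
headers = {"Content-Type": "application/json; charset=utf-8"}
response = requests.post(url, data=body, headers=headers, auth=("", "PERSONAL_ACCESS_TOKEN"))
print(response.status_code, response.text)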

Azure container instance is not accessible using URL via browser

I have created a new Container instance in Azure. Below are the steps.
Step 1: I created a new Cognitive Services resource (a Language service) and used its "Key" and "Endpoint" values inside the container instance.
Step 2: I created a new container instance and provided all the required information as described in the article below:
https://learn.microsoft.com/en-us/azure/container-instances/container-instances-quickstart-portal
However, I changed the port from 80 to 5001 and used the image "mcr.microsoft.com/azure-cognitive-services/textanalytics/healthcare:latest".
Below are the environment variables I used:
{
    "name": "Eula",
    "value": "accept"
},
{
    "name": "RAI_TERMS",
    "value": "accept"
},
{
    "name": "Billing",
    "value": "XXXXXXXXXXXXXXXXXXXXXXXXXXX"
},
{
    "name": "ApiKey",
    "value": "4a46537f51f64765864cabc20318bdcc"
},
{
    "name": "enablelro",
    "value": "true"
}
Finally, it was created and deployed successfully. I then tried to access it via the URL below:
http://FQDN:5001/Demo/
(FQDN = the fully qualified domain name of the container instance is used in the URL.)
It's not accessible, even though the instance is up and running properly.
It doesn't matter which port you are trying to access it on. Instead of using the URL http://FQDN:5001/Demo/, I would suggest you use the FQDN or IP address of the container instance.
Using the complete FQDN when identifying something is the way it is supposed to be done.
You can refer to this thread, where I reproduced something related to your question and used the FQDN to access the container instance.
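To confirm what the instance actually responds on before adjusting the URL, a quick probe of the FQDN can help. The sketch below is an assumption-heavy diagnostic, not part of the original answer: the FQDN is a placeholder, the port must match what the container actually listens on (cognitive services containers typically listen on 5000 inside the container), and the paths are only examples.
# Quick reachability check against the container instance; FQDN, ports and paths are placeholders.
import requests

fqdn = "mycontainergroup.westeurope.azurecontainer.io"  # placeholder FQDN
for url in (f"http://{fqdn}:5001/status", f"http://{fqdn}:5001/", f"http://{fqdn}/"):
    try:
        r = requests.get(url, timeout=10)
        print(url, "->", r.status_code)
    except requests.RequestException as exc:
        print(url, "->", exc)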

AzureBlobStorageOnIoTEdge: Error Target container connection not specified, upload turned off

My local blob storage is not uploading blobs to my cloud storage account. It reports back
"configurationValidation": {
"deviceAutoDeleteProperties": {
"deleteOn": {
"Status": "Success"
},
"deleteAfterMinutes": {
"Status": "Warning",
"Message": "Auto Delete after minutes value not specified, auto deletion turned off."
},
"retainWhileUploading": {
"Status": "Success"
}
},
"deviceToCloudUploadProperties": {
"uploadOn": {
"Status": "Success"
},
"cloudStorageAccountName": {
"Status": "Error",
"Message": "Target container connection not specified, upload turned off."
},
"cloudStorageAccountKey": {
"Status": "Error",
"Message": "Target container connection not specified, upload turned off."
},
"uploadOrder": {
"Status": "Success"
},
"deleteAfterUpload": {
"Status": "Success"
}
}
},
I am pretty sure that it should work. My desired properties are
"deviceToCloudUploadProperties": {
"uploadOn": true,
"uploadOrder": "OldestFirst",
"cloudStorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=*****;AccountKey=******;EndpointSuffix=core.windows.net",
"storageContainersForUpload": {
"***": {
"target": "***"
}
},
"deleteAfterUpload": true
}
The container exists locally and in the cloud storage account. I copied the primary connection string from my local storage account into the configuration. The local storage is working: I can see that my container was created and contains data, but it doesn't want to synchronize with the cloud. Why is it saying "Target container connection not specified, upload turned off."? It sounds like this part is missing:
"storageContainersForUpload": {
"***": {
"target": "***"
}
},
but obviously it is not.
I'm using the latest Docker image of this service. Is there any chance to use an older version? Some months ago I was able to make this work. I tried to use a different version like mcr.microsoft.com/azure-blob-storage:1.4.0, but it doesn't accept any tag other than latest.
Thx!
The difference between my working version of the local blob storage module and my non-working version was that the non-working version was deployed through a deployment plan. In a deployment plan you cannot simply paste in the module twin settings from the blob storage on IoT Edge documentation (https://learn.microsoft.com/en-us/azure/iot-edge/how-to-deploy-blob?view=iotedge-2020-11).
You need to split the configuration into two parts, where the first part looks like this
and the second part looks like that.
And that totally makes sense. If you want to update your modules, you probably want to keep your configuration, because there might have been changes made, e.g. by a customer. This gives you the possibility to add properties to your initial configuration later without changing anything that was already configured. In fact, every device can keep its individual configuration at any time.
My wrongly configured reported properties were hidden under the suggested default path "properties.desired.settings", and thus the edge runtime could not find them.
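In other words, the upload settings have to end up directly under the blob module's properties.desired, not nested under a settings sub-key. Below is a sketch of that layout, expressed as a Python dict for illustration and following the structure shown in the linked how-to-deploy-blob document; the account, key and container names are placeholders.
# Sketch of where the blob module expects its configuration in the module twin:
# both property groups sit directly under properties.desired, not under a "settings" sub-path.
desired_properties = {
    "deviceAutoDeleteProperties": {
        "deleteOn": True,
        "deleteAfterMinutes": 15,
        "retainWhileUploading": True,
    },
    "deviceToCloudUploadProperties": {
        "uploadOn": True,
        "uploadOrder": "OldestFirst",
        "cloudStorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net",
        "storageContainersForUpload": {"localcontainer": {"target": "cloudcontainer"}},
        "deleteAfterUpload": True,
    },
}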

chronograf: Not able to add default influxDB connection when using OAuth 2.0

I configured Chronograf to use generic OAuth 2.0 (using Cloud Foundry UAA). User authentication works fine, but the problem is that the default InfluxDB connection is not taken into account. In fact, this configuration works:
chronograf --log-level="debug" --resources-path="/usr/share/chronograf/resources" --influxdb-url="http://influxDB.log.database:8086" --influxdb-username="usename" --influxdb-password="pass"
Here is the content of the /usr/share/chronograf/resources folder:
influxdb.src:
{
    "id": "9999",
    "name": "MyInfluxDB",
    "username": "user1",
    "password": "password1",
    "url": "http://influxDB.log.database:8086",
    "type": "influx",
    "insecureSkipVerify": true,
    "default": true,
    "telegraf": "telegraf.autogen",
    "organization": "Default"
}
Both connections are automatically created when chronograf starts:
MyInfluxDB
http://influxDB.log.database:8086
But when I run chronograf with the following options (to use OAuth 2.0 and create an InfluxDB connection):
export TOKEN_SECRET="token_secret"; export JWKS_URL="https://uaa/token_keys"; export PUBLIC_URL="http://chronograf:8888"; chronograf --log-level="debug" --resources-path="/usr/share/chronograf/resources" --generic-name="generic" --generic-client-id="id" --generic-client-secret="secret" --generic-scopes="openid" --generic-auth-url="https://uaa/oauth/authorize" --generic-token-url="https://uaa/oauth/token" --generic-api-url="https://uaa/userinfo"
OAuth 2.0 works fine, but once I am redirected to the Chronograf dashboard I cannot see the connections. Even when I create a connection manually and log in again, I cannot find any connection that was created automatically on startup as intended.
The organization field needs an id. The id of the Default organization uses a lowercase d. If you change your src file to:
{
    "id": "9999",
    "name": "MyInfluxDB",
    "username": "user1",
    "password": "password1",
    "url": "http://influxDB.log.database:8086",
    "type": "influx",
    "insecureSkipVerify": true,
    "default": true,
    "telegraf": "telegraf.autogen",
    "organization": "default"
}
It should now work.
You can see where the id is defined in their source here: https://github.com/influxdata/chronograf/blob/9d8a49ba0ef8131cdce22d73718859f55f434db2/bolt/organizations.go#L20
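A quick way to catch both problems at once (invalid JSON in a .src file and a wrongly cased organization id) is a small check like the sketch below; the directory path is the resources folder from the question, and the script is just an illustration, not part of Chronograf.
# Sanity check for chronograf .src files: valid JSON and a lowercase organization id.
import json
import pathlib

for src in pathlib.Path("/usr/share/chronograf/resources").glob("*.src"):
    try:
        data = json.loads(src.read_text())
    except json.JSONDecodeError as exc:
        print(f"{src.name}: invalid JSON ({exc})")
        continue
    org = data.get("organization")
    if org != "default":
        print(f'{src.name}: organization is "{org}", expected the id "default"')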

Simple swagger specification, to retrieve an html web page

I'm new to Swagger, and I'm trying to make a very simple specification with only a GET method in order to retrieve a web page. This is the code:
{
    "swagger": "2.0",
    "info": {
        "title": "example",
        "description": "Sample api to retrieve a web page.",
        "version": "0.1"
    },
    "host": "example.org", #"localhost:8080",
    "schemes": [
        "https"
    ],
    "paths": {
        "/": {
            "get": {
                "summary": "Return the web page.",
                "description": "",
                "produces": ["text/html"],
                "responses": {
                    "200": {
                        "description": "OK"
                    },
                    "400": {
                        "description": "Bad request"
                    },
                    "404": {
                        "description": "Not Found"
                    }
                }
            }
        }
    }
}
I'm using the Swagger online editor.
Unfortunately, when I execute the request it does not return the web page or any of the status codes that I defined in the specification. Instead, the details section shows the error:
TypeError: Failed to fetch
Can someone tell me where I'm going wrong?
Thank you.
For the "Try it out" button to work in the Swagger online editor, your API endpoints must be CORS-enabled. That is, your server (example.org or localhost:8080) must be configured to return certain response headers that would allow editor.swagger.io to make cross-domain requests to your server. This is explained in more details here:
https://github.com/swagger-api/swagger-ui#cors-support
The way to configure CORS depends on the server/framework used to host the app. This page has instructions for some common web servers:
https://enable-cors.org/server.html
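For example, if the page were served by a small Flask app, the headers the Swagger editor needs could be added roughly as below. Flask is only one option and the header values are deliberately permissive placeholders; adapt this to whatever server actually hosts example.org or localhost:8080.
# Minimal sketch of adding CORS headers, using Flask as an example server.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "<html><body>Hello</body></html>"

@app.after_request
def add_cors_headers(response):
    # Allow the Swagger online editor (a different origin) to call this API.
    response.headers["Access-Control-Allow-Origin"] = "*"
    response.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
    response.headers["Access-Control-Allow-Headers"] = "Content-Type, api_key, Authorization"
    return response

if __name__ == "__main__":
    app.run(port=8080)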
