Retrieving full-sized images stored in an Image type column in a Dataverse table through Azure API Management - OData

We are storing images in a Dataverse table in an Image type column. When trying to retrieve one using an OData call through API Management, it only returns the image at icon size (144x144).
I have tried to use [Organization URI]/api/data/v9.1/accounts(C0864F1C-0B71-E911-8196-000D3A6D09B3)/myentityimage/$value?size=full, but the clients are not authenticated to the backend, so while it works in testing it won't work in a production environment.
We build up our OData query using FetchXML. It gives us the option to select the image, which is returned as Base64, but only at icon size (144x144).
Is there any way, using this method, to return the full image? I cannot find how to append size=full.
The FetchXML we are using:
<fetch>
  <entity name="[Entity Name]">
    <attribute name="bt_attachment" />
  </entity>
</fetch>
The OData query that it generates:
https://[Organization URI].crm4.dynamics.com/api/data/v9.2/[Table Name]?$select=bt_attachment
The image column is the only one returned.
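For reference, here is a minimal sketch (Python with requests) of the direct Web API call with ?size=full mentioned above. The token acquisition is a placeholder, and the call still requires the caller to authenticate against Dataverse, which is exactly the blocker described; it does not solve appending size=full to the FetchXML-generated query.

import requests

# Hedged sketch: org URL, record id and image column are taken from the question;
# a valid Dataverse access token (e.g. acquired via MSAL) is assumed.
ORG_URL = "https://yourorg.crm4.dynamics.com"
RECORD_ID = "C0864F1C-0B71-E911-8196-000D3A6D09B3"
ACCESS_TOKEN = "<Dataverse access token>"

resp = requests.get(
    f"{ORG_URL}/api/data/v9.1/accounts({RECORD_ID})/myentityimage/$value",
    params={"size": "full"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

with open("image_full.jpg", "wb") as f:
    f.write(resp.content)  # full-resolution bytes instead of the 144x144 thumbnail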

Related

Azure Data Factory: read URL values from CSV and copy to SQL database

I am quite new to ADF, so I am asking for any suggestions.
The use case:
I have a CSV file which contains a unique id and URLs (see image below). I would like to use this file to export the values from the various URLs. In the second image you can see an example of the data from a URL.
In the current situation I am using each URL and inserting it manually as the source of an ADF Copy activity to export the data to a SQL DB. This is a very time-consuming method.
How can I create an ADF pipeline that uses the CSV file as a source, so that a Copy activity takes the URL from each row and copies the data to an Azure SQL DB? Do I need to add a Get Metadata activity, for example? If so, how?
Many thanks.
Use a Lookup activity that reads all the data, then use a ForEach loop which processes it line by line. Inside the ForEach, use a Copy activity where you can copy the response to the sink.
In order to copy the XML response of a URL, we can use an HTTP linked service with an XML dataset. As @BeingReal said, a Lookup activity should be used to refer to the table which contains all the URLs, and inside the ForEach activity, use a Copy activity with HTTP as the source and whatever sink the requirement calls for. I tried to repro the same in my environment. Below are the steps.
A lookup table with 3 URLs is taken, as in the below image.
A ForEach activity is added in sequence with the Lookup activity.
Inside the ForEach, a Copy activity is added. The source is given as the HTTP linked service.
In the HTTP linked service, the base URL is given as @item().name. name is the column that stores the URLs in the lookup table; replace name with the column name you used in your lookup table.
In the sink, an Azure SQL database is given (any sink your requirement calls for can be used). The data is copied to the SQL database.
This is the HTTP dataset inside the Copy activity.
This is the input of the Copy activity inside the ForEach.
This is the output of the Copy activity.
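For anyone who prefers to see the data flow spelled out, the Lookup -> ForEach -> Copy pattern above is roughly equivalent to this standalone Python sketch (the file name and the id/url column headers are hypothetical, and the SQL sink is left as a comment rather than a real insert):

import csv
import requests

# Lookup activity: read every row of the CSV (columns "id" and "url" are assumed).
with open("urls.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# ForEach activity: iterate over the lookup output one row at a time.
for row in rows:
    # Copy activity source: what the HTTP linked service does with @item().<url column>.
    response = requests.get(row["url"])
    response.raise_for_status()
    xml_payload = response.text
    # Copy activity sink: write xml_payload (or its parsed fields) to the Azure SQL table.
    print(row["id"], len(xml_payload))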
My sink is an Azure SQL Database without any tables yet. I would like to auto-create the table on the fly from ADF. I don't understand why this error came up.

Is it possible to overwrite the Standard Variant in SmartVariantManagement?

I created a smart table using the columns provided by OData service and the corresponding metadata. I also added the smart variant management to the smart table by setting the properties useVariantManagement, useTablePersonalisation and persistencyKey. I can now see the variant management control and create new variants. So far so good.
However, since there is no way to define the column width in the annotation file for the OData service (or is there?), the columns have a predefined width in the standard variant. Therefore, I am resizing the column widths in the afterRendering event of the smart table to consider the length of the data provided by the OData service. The problem I now face is that by resizing the columns, I am adjusting the standard variant and am therefore getting a modified flag (so it says Standard*).
I can see that it is not possible to adjust or delete the standard variant in the frontend but is there any way to save the new layout as the standard variant in the backend?
You can overwrite column widths client side in the XML:
<smartTable:SmartTable entitySet="Setname" ...>
  <t:Table alternateRowColors="true" visibleRowCountMode="Auto" selectionMode="MultiToggle">
    <t:columns>
      <t:Column width="5em" sortProperty="Columnname" filterProperty="Columnname" app:p13nData='{"columnKey": "Columnname", "leadingProperty": "Columnname"}'>
        <t:template>
          <Text text="{Columnname}"/>
        </t:template>
        <Label text="{/#Setname/Columnname/#sap:label}"/>
      </t:Column>
      ...

Listing docker images in reverse chronological order using the Artifactory API

I am trying to use the below API to get the list of docker images so that I can populate a dropdown in a Jenkins build. Is there a way that this could be listed in reverse chronological order rather than alphanumerically, so that the newest image is at the top? Thanks.
/artifactory/api/docker/repo/v2/image/tags/list
You will have to use the Artifactory Query Language (AQL).
An example AQL fragment is...
items.find({"repo":{"$eq":"<repositoryname>"}, "name":{"$eq" : "<artifactoryItemName>"}}) .sort({"$desc" : ["created"]})
The descending sort order is specified with $desc sort operator on the timestamp field created.
You can also limit the number of results returned by adding an extra limit to the above query...
items.find({"repo":{"$eq":"<repositoryname>"}, "name":{"$eq" : "<artifactoryItemName>"}}) .sort({"$desc" : ["created"]}).limit(10)
The AQL needs to be submitted at /artifactory/api/search/aql.
The same can be done via the REST API with a POST request. The content should be posted not as JSON but as plain text, exactly as the query is written above. The content type header is Content-Type: text/plain. You can use Basic authentication or any of the other supported authentication methods.
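Putting that together, a small sketch of the POST in Python with requests (the base URL, repository name, item name and credentials are placeholders):

import requests

# The AQL is sent as plain text, exactly as it would be written in the AQL console.
AQL = (
    'items.find({"repo": {"$eq": "<repositoryname>"}, '
    '"name": {"$eq": "<artifactoryItemName>"}})'
    '.sort({"$desc": ["created"]})'
    '.limit(10)'
)

resp = requests.post(
    "https://artifactory.example.com/artifactory/api/search/aql",
    data=AQL,
    headers={"Content-Type": "text/plain"},
    auth=("user", "password-or-api-key"),  # Basic authentication
)
resp.raise_for_status()

# Results come back newest first because of the $desc sort on "created".
for item in resp.json()["results"]:
    print(item["created"], item["name"])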
There are a ton of things you can do with AQL. The syntax can look a bit confusing to begin with.

Retrieving more columns as part of a VSTS query

I'm trying to fetch details from VSTS using the VSTS query API. To get all Portfolio Epics I created a custom query and used its ID to get the results in JSON format. The query looks like this:
https://dev.azure.com/{organization}/{project}/{team}/_apis/wit/wiql/{id}?api-version=5.0-preview.2
But the issue is it's not giving me many details about each of the work items in the JSON. It only lists the ID and URL, like this:
WorkItems:[
{ID:234,URL:"workitemurl"},
{ID:235,URL:"workitemurl"},
{ID:236,URL:"workitemurl"},
...
]
So if I need more details about an item, I have to execute the individual URL for each PE to get its details. Instead, I am asking whether there is any way of getting the KeyedInID (KID) of each work item along with the ID and URL, like this. Please note KID is a field returned if we execute each work item URL separately. To avoid that extra step, I would like to get it along with the work items.
WorkItems:[
{ID:234,URL:"workitemurl",KID:002},
{ID:235,URL:"workitemurl",KID:023},
{ID:236,URL:"workitemurl",KID:033},
...
]
So how can we make this possible?
The Web UI uses a different API to get query results (/_api/_wit/_query), which allows query+data in a single pass. This is an old __v5 type call, which means it's considered internal.
The proper way to do this now is to first run the query as you're doing it right now and then call /_apis/wit/workitems?ids=1,2,3,4 using the IDs from the references you got from the first call. That also allows you to load the details dynamically and in small batches, which results in a more responsive UI.
See:
https://learn.microsoft.com/en-us/rest/api/azure/devops/wit/work%20items/list?view=azure-devops-rest-4.1
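A hedged sketch of that two-step flow in Python with requests (the organization, project, team, query id and personal access token are placeholders, and Custom.KeyedInID stands in for whatever the KID field's actual reference name is):

import requests

ORG, PROJECT, TEAM = "myorg", "myproject", "myteam"
QUERY_ID = "<stored query id>"
AUTH = ("", "<personal access token>")  # the PAT goes in the password slot

base = f"https://dev.azure.com/{ORG}/{PROJECT}"

# Step 1: run the stored query; the response only contains work item ids and URLs.
wiql = requests.get(
    f"{base}/{TEAM}/_apis/wit/wiql/{QUERY_ID}?api-version=5.0-preview.2",
    auth=AUTH,
).json()
ids = [str(wi["id"]) for wi in wiql["workItems"]]

# Step 2: fetch details in batches of up to 200 ids, requesting only the fields needed.
fields = "System.Id,System.Title,Custom.KeyedInID"  # Custom.KeyedInID is hypothetical
for start in range(0, len(ids), 200):
    batch = ",".join(ids[start:start + 200])
    items = requests.get(
        f"{base}/_apis/wit/workitems?ids={batch}&fields={fields}&api-version=5.0",
        auth=AUTH,
    ).json()
    for item in items["value"]:
        print(item["id"], item["fields"].get("Custom.KeyedInID"))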

Not able to filter messages based on header properties in Azure Stream Analytics

I have created an Azure Stream Analytics (ASA) job to filter data based on a custom header property I send from a client app.
How would I read/filter message header properties in Azure Stream Analytics?
The portal returns no results when I try to test my query. Below is my query in the Azure portal.
So far my query is as simple as this:
SELECT
*
INTO
[mystorage]
FROM
[iothubin]
WHERE Properties.type = "type1"
I also tried to reference the key without its parent (such as: where type = "") with no results either.
I am sure that I am sending messages with this custom property in the header, since I can view it using the Device Explorer tool.
Any idea how to get this working?
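For context, this is roughly what setting such a custom property looks like on the device side, sketched here with the azure-iot-device Python SDK (the actual client app may use a different SDK; the connection string and payload are placeholders). Custom properties travel as message metadata rather than in the payload, which is why a payload-level WHERE clause does not see them.

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string for the sending device.
client = IoTHubDeviceClient.create_from_connection_string("<device connection string>")

msg = Message('{"temperature": 22.5}')   # the payload ASA sees as columns
msg.custom_properties["type"] = "type1"  # the custom header/application property
client.send_message(msg)
client.shutdown()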
I haven't tried this yet myself, but supposedly you can access custom properties via GetMetadataPropertyValue(). Give this a try:
https://msdn.microsoft.com/en-us/library/azure/mt793845.aspx
You can use the query described here as an example to query complex schemas.
If you share your schema, we can look at the query for you.
Let me know if it works for you.
Thanks,
JS
