Finding DeleteEntity calls to an Azure Storage table

Is there a way to find out if there was any DeleteEntity call to an Azure table in the last 'N' minutes? Basically my goal is to find all operations that updated the table in the last 'N' minutes.
Update: I am looking for a way to do it with a REST API call for a specific table in the storage account.

If using the Azure Portal is an option, you can find this information via Metrics.
Basically, I am taking a sum of all Transactions against my table storage where the API name was DeleteEntity.
You can find more information about it here: https://learn.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json.
UPDATE
If you wish to get this information programmatically, I believe you will need to use the Azure Monitor REST API. I looked at the request sent by the Portal, and it calls the /subscriptions/<my-subscription-id>/resourceGroups/<my-resource-group>/providers/Microsoft.Storage/storageAccounts/<my-storage-account>/tableServices/default/providers/Microsoft.Insights/metrics/Transactions endpoint.
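A minimal sketch of that call from Python, assuming you already have an AAD bearer token with monitoring read access; the subscription, resource group, account name, and window are placeholders, and you should verify the api-version and the ApiName dimension filter against the current Azure Monitor docs:

import datetime
import requests

SUBSCRIPTION = "<my-subscription-id>"      # placeholders -- substitute your own
RESOURCE_GROUP = "<my-resource-group>"
ACCOUNT = "<my-storage-account>"
TOKEN = "<aad-bearer-token>"               # e.g. obtained via azure-identity

resource = (
    f"/subscriptions/{SUBSCRIPTION}/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.Storage/storageAccounts/{ACCOUNT}/tableServices/default"
)
end = datetime.datetime.now(datetime.timezone.utc)
start = end - datetime.timedelta(minutes=30)   # "last N minutes"

resp = requests.get(
    f"https://management.azure.com{resource}/providers/Microsoft.Insights/metrics",
    params={
        "api-version": "2018-01-01",
        "metricnames": "Transactions",
        "aggregation": "Total",
        "interval": "PT1M",
        "timespan": f"{start:%Y-%m-%dT%H:%M:%SZ}/{end:%Y-%m-%dT%H:%M:%SZ}",
        "$filter": "ApiName eq 'DeleteEntity'",
    },
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
# Print each one-minute bucket that contains DeleteEntity transactions
for series in resp.json()["value"][0]["timeseries"]:
    for point in series["data"]:
        if point.get("total"):
            print(point["timeStamp"], point["total"])

Note that this metric is scoped to the whole Table service of the account, not to a single table.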
UPDATE 2
For a specific table, the only option I can think of is to fetch the data from Storage Analytics logs, which are stored in the $logs blob container, and then parse the log entries manually. You may find these links helpful:
https://learn.microsoft.com/en-us/rest/api/storageservices/storage-analytics-log-format
https://learn.microsoft.com/en-us/rest/api/storageservices/storage-analytics-logged-operations-and-status-messages#logged-operations
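A rough sketch of that approach with the azure-storage-blob package; the connection string and table name are placeholders, it assumes classic Storage Analytics logging is enabled for the Table service, and the field positions follow log format version 1.0 (verify them against the format document above):

from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-connection-string>"   # placeholders
TABLE_NAME = "mytable"
N_MINUTES = 30

cutoff = datetime.now(timezone.utc) - timedelta(minutes=N_MINUTES)
logs = BlobServiceClient.from_connection_string(CONN_STR).get_container_client("$logs")

# Table service log blobs are named like table/YYYY/MM/DD/hhmm/000000.log
for blob in logs.list_blobs(name_starts_with="table/"):
    if blob.last_modified < cutoff:
        continue
    content = logs.download_blob(blob.name).readall().decode("utf-8")
    for line in content.splitlines():
        fields = line.split(";")
        # Log format v1.0: field 1 = request start time, field 2 = operation type
        if len(fields) > 2 and fields[2] == "DeleteEntity" and f"/{TABLE_NAME}" in line:
            print(fields[1], fields[2])

Be aware that the logs are written with some delay, so the most recent minutes may not be available yet.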

Related

How to expand all custom fields in the Jira API

When I make a call to the Jira API, the response contains a large number of custom fields with different IDs.
Is it possible to resolve them directly in the API response?
I tried calling /rest/api/2/field to obtain all Jira custom fields, but after that I would have to write a manual mapping script (see the sketch below), which is quite annoying. Is there a solution for this problem?
Later on I will also need to export, for every issue, the entire activity, attachments, and comments, to store them in an external archive that can be consulted in the future.
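A small sketch of the mapping approach mentioned above, assuming Jira Cloud with basic auth and an API token; the base URL, credentials, and issue key are placeholders:

import requests

BASE = "https://your-domain.atlassian.net"     # placeholders
AUTH = ("user@example.com", "<api-token>")

# Build a map of field id -> human-readable name from /rest/api/2/field
fields = requests.get(f"{BASE}/rest/api/2/field", auth=AUTH).json()
names = {f["id"]: f["name"] for f in fields}

# Fetch an issue and replace customfield_* keys with their display names
issue = requests.get(f"{BASE}/rest/api/2/issue/PROJ-1", auth=AUTH).json()
readable = {names.get(k, k): v for k, v in issue["fields"].items()}
print(readable)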

Fluentd + Azure Data Explorer cluster

I'm working on a Fluentd setup in Kubernetes. In Kubernetes I have a number of applications which write logs to stdout. I can filter, parse, and send the logs to Azure Blob Storage. But I want the logs from Blob Storage to be ingested into an Azure Data Explorer cluster. In the Data Explorer cluster I have a database and a table which already has a schema defined. The question is how do I modify the event from Fluentd in such a way that it meets the table schema? Is it possible at all? Maybe there are some alternative ways of creating such a setup?
Take a look at ingestion mappings: you can pick the properties that you care about and route them to the applicable columns, and when a new property arrives you can change the mapping and the table schema will automatically be updated.
Yes, it is possible to do this. You can ingest data stored in your blob into a custom table on Azure Data Explorer. Refer to this link:
https://learn.microsoft.com/en-us/azure/data-explorer/ingest-json-formats?tabs=kusto-query-language#ingest-mapped-json-records
Below is an example where I ingest a JSON document stored in a blob into a table in ADX:
.ingest into table Events ('https://kustosamplefiles.blob.core.windows.net/jsonsamplefiles/simple.json') with '{"format":"json", "ingestionMappingReference":"FlatEventMapping"}'
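Note that the FlatEventMapping referenced above has to exist before the ingest command runs. Here is a hedged sketch of creating such a mapping from Python with the azure-kusto-data package; the cluster URL, database, and column definitions are illustrative, not taken from the linked sample:

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<my-cluster>.<region>.kusto.windows.net"   # placeholder cluster
)
client = KustoClient(kcsb)

# A JSON ingestion mapping routes incoming JSON properties to table columns.
create_mapping = (
    ".create-or-alter table Events ingestion json mapping 'FlatEventMapping' "
    "'[{\"column\":\"Time\",\"path\":\"$.timestamp\"},"
    "{\"column\":\"Device\",\"path\":\"$.deviceId\"}]'"
)
client.execute_mgmt("<my-database>", create_mapping)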
If the schema is difficult to parse, I would recommend ingesting first into a raw table (source table). Then you can have an update policy that moves this data into different tables after parsing. You can check this link to understand update policies.
Consider using the ability to listen for blobs landing in storage via the Event Grid mechanism. Check out https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-grid-overview

People API in Microsoft Graph not returning the updated results

We recently updated the profile information in Azure Active Directory. When we use the People API in Graph Explorer, it does not return the latest information. How much time does it normally take to return the updated information?
The /me/people API actually returns results from a search index, not a real-time fetch directly from Azure Active Directory. How quickly the search index updates depends on many characteristics of your AAD tenant. It is safe to say you should see the change reflected within 24 hours.
Obviously there is value in the People API and its underlying relevance logic, but if you wish to have the data updated instantly, you should use the /users/ endpoint to fetch the latest content.
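For illustration, a direct /users/ read from Python; the user principal name, access token, and selected properties are placeholders:

import requests

TOKEN = "<graph-access-token>"   # placeholder token with User.Read.All (or User.Read for /me)

# /users/{id} reads directly from Azure AD, so profile edits show up right away,
# unlike /me/people which is served from a search index.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/users/user@contoso.com",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"$select": "displayName,jobTitle,department"},
)
resp.raise_for_status()
print(resp.json())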

Caching strategy of parse.com

One of my applications is using Parse.com as its backend service. There is one table called Product which is queried with the cache policy kPFCachePolicyCacheElseNetwork. The problem is that the client side always gets the cached data, even after I modify some of the fields. The reason I don't always fetch data over the network is that I'm trying to save data traffic as much as possible.
My question is whether there is a way to expire the cache on the server side so that I'll get new data on the client side as soon as I modify the data. Thanks. (My only solution so far is to delete the client app and reinstall it, which is obviously not ideal.)
You need to decide some time limit on the cache validity, generally on the client, and either call clearCachedResult on the query instance or clearAllCachedResults on PFQuery when the limit is exceeded.
You could create a cloud function which returns a minimal amount of data and informs the app about changes so it can decide how / when / which caches to remove. For example, you pass a list of class names and last requested dates and the cloud function returns the names of classes which have new data since those dates.
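As a rough illustration of that contract, here is how a client could call such a function over the classic Parse REST API (rather than from the iOS SDK); the classesWithChanges function name, app id, key, and request/response shapes are all hypothetical:

import requests

resp = requests.post(
    "https://api.parse.com/1/functions/classesWithChanges",   # hypothetical Cloud Code function
    headers={
        "X-Parse-Application-Id": "<app-id>",       # placeholders
        "X-Parse-REST-API-Key": "<rest-api-key>",
    },
    json={"classes": {"Product": "2015-06-01T00:00:00Z"}},    # last fetch date per class
)
resp.raise_for_status()
stale = resp.json()["result"]          # e.g. ["Product"] if newer data exists
if "Product" in stale:
    # clear the cached PFQuery results for Product and refetch over the network
    pass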

Ruby on Rails, sharing data between applications via a central REST API store

So here is the basic structure I'm proposing:
Data warehouse (for want of a better word)
E-commerce online
Back-end MIS
etc
So the idea is that I have an Order, for example. An order can be created via the e-commerce site, or via the back-end MIS. In either case the order should filter through to the e-commerce site to show the order to the user, and vice versa.
There will be other apps in the future.
So the thinking is to have a central warehouse that wraps this data in a service API, and then the other apps push to / pull from it.
Sound OK? I guess the question is syncing the data. When I create an order, do I push the order to the warehouse at create time, or put it on some queue, or is there some other method to keep all these in sync, assuming near-realtime to realtime sync is required?
Assume your REST server is just another data store. How would each client get updates from a plain old database when needed?
If you had each client poll the data store at regular intervals, that would be one solution.
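A naive polling client, sketched here in Python for brevity; the endpoint and the updated_since parameter are illustrative and would map onto whatever the central warehouse API actually exposes:

import time
from datetime import datetime, timezone
import requests

BASE = "https://warehouse.example.com/api"   # placeholder central API
last_sync = "1970-01-01T00:00:00+00:00"

while True:
    # Ask the warehouse only for records changed since the last poll.
    resp = requests.get(f"{BASE}/orders", params={"updated_since": last_sync})
    resp.raise_for_status()
    for order in resp.json():
        print("changed order:", order["id"])
    last_sync = datetime.now(timezone.utc).isoformat()
    time.sleep(60)   # the poll interval is the trade-off against "near realtime"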
