I'm using the very interesting IoT Edge module Metrics-Collector, and it's working fine with the Log Analytics workspace. My question is more about the authentication of the metrics-collector module against the Log Analytics workspace.
For option 1 in the documentation we need to use the workspace ID and key. The possible security issue I see is that the same key would be used on all Edge devices, and if the key is compromised, malicious logs could be injected into the monitoring data.
My question is whether there is an option to create some sort of SAS for connecting to the Log Analytics workspace to "insert" the data. Or is the solution to use option 2, or to rotate the Log Analytics workspace key and distribute it to the edge devices?
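To make the rotation option concrete, here is a minimal sketch using the Azure CLI. All resource, hub, and device names and the __WORKSPACE_KEY__ placeholder are illustrative, and az iot edge set-modules assumes the azure-iot CLI extension is installed:

# Fetch the current (or freshly regenerated) workspace key.
KEY=$(az monitor log-analytics workspace get-shared-keys \
  --resource-group myResourceGroup \
  --workspace-name myWorkspace \
  --query primarySharedKey -o tsv)

# Substitute the key into a deployment manifest template and push the
# manifest to an edge device (requires the azure-iot CLI extension).
sed "s|__WORKSPACE_KEY__|$KEY|" deployment.template.json > deployment.json
az iot edge set-modules \
  --hub-name myHub \
  --device-id myEdgeDevice \
  --content deployment.json

Looping the last two commands over the device list would redistribute a rotated key to the whole fleet.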
Thanks & regards,
Gerhard
I've created an Azure DevOps pipeline for building a React Native mobile application for iOS. I have a service connection to Apple's App Store Connect, and it uploads the build to TestFlight as expected. I am also passing release notes to the AppStoreRelease task, and this requires fastlane credentials to get around two-factor authentication.
I tried the approach of creating a fastlane session ID locally and putting it onto the service connection as per https://github.com/fastlane/fastlane/blob/master/spaceship/README.md#support-for-ci-machines. This works, but the session times out and has to be recreated, which isn't sustainable.
I've seen documentation on using App Store Connect keys to authenticate (https://docs.fastlane.tools/app-store-connect-api/), but I haven't seen any documentation on how to do this for Azure DevOps. Has anyone done this who can provide documentation or a pointer in the right direction?
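One hedged workaround, in case the AppStoreRelease task itself does not expose API-key authentication: call fastlane from a plain script step and hand it an App Store Connect API key file, whose JSON layout is documented on the fastlane page linked above. The $(AscKeyId)-style names are placeholder secret variables you would define in Azure DevOps; they are macro-expanded before the shell runs, so the quoted heredoc is safe:

# Build fastlane's API key file from pipeline secret variables.
# The "key" field holds the .p8 private key content.
cat > api_key.json <<'EOF'
{
  "key_id": "$(AscKeyId)",
  "issuer_id": "$(AscIssuerId)",
  "key": "$(AscPrivateKey)",
  "in_house": false
}
EOF

# pilot/deliver accept the key file instead of a 2FA session.
fastlane pilot upload --api_key_path api_key.json --ipa MyApp.ipa

Since the key does not expire the way a spaceship session does, the pipeline no longer needs periodic re-authentication.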
I have my .NET apps hosted in Azure Web Apps. Is there any way I can stream/view the application/server log traces directly without signing into the portal?
The reason I need this is that my team contains fellow developers who will not have access to the Azure portal.
Please help if there is any solution for this. Thanks in advance.
I have tried enabling log streaming inside the Azure portal, but that doesn't meet my requirement.
I also tried storing the logs in an Azure storage account, but I cannot find any open-source tools to fetch and read the logs, and this feels like a time-consuming solution.
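For what it's worth, if the logs do land in a storage account, a dedicated tool may not be needed at all. A hedged sketch of pulling them with the Azure CLI, where the account and container names are placeholders and $SAS_TOKEN is a shared access signature so no portal sign-in is required:

# Download every log blob from the container into a local folder.
az storage blob download-batch \
  --account-name mylogstorage \
  --source weblogs \
  --destination ./logs \
  --sas-token "$SAS_TOKEN"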
Mohit's recommendations are great and probably the best advice, however:
I have a suggestion which does not fulfil the requirement of avoiding a role in Azure, but it may offer enough of an advantage to be worth it. Using the Azure CLI you can stream the logs:
az webapp log tail --name appname --resource-group myResourceGroup
https://learn.microsoft.com/en-us/azure/app-service/troubleshoot-diagnostic-logs#streamlogs
You may be able to set up a role constrained so that all a developer can do is read the diagnostic logs:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/roles-permissions-security
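As a sketch of that idea, a custom role could look like the following. The action list is an assumption rather than a verified minimal set, so check which permissions log access actually requires in your environment:

# Create a subscription-scoped custom role intended only for reading
# site metadata and the publish profile used when streaming logs.
az role definition create --role-definition '{
  "Name": "Web App Log Reader",
  "Description": "Read-only access for viewing web app logs",
  "Actions": [
    "Microsoft.Web/sites/read",
    "Microsoft.Web/sites/publishxml/action"
  ],
  "AssignableScopes": ["/subscriptions/<subscription-id>"]
}'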
Also, if you are not familiar with it, I'd suggest looking at Azure Application Insights. It does not capture the lowest-level logs, but it is likely sufficient for diagnosing the issues a developer typically runs into, and it has many advanced features that make diagnosis far easier than digging through log files.
https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview
The simplest way to achieve this is to use a storage account and container for application and diagnostic logging.
To enable diagnostics for your Azure web app, you can do the following:
Log in to your account at https://portal.azure.com/.
Go to your Azure Web App and select Settings > Diagnostics logs.
For Application Logging (Blob), click On and set the parameters.
Set the Level for the logging.
For Storage Settings, click > and select the Storage Account and Container.
This is the storage account and container that Azure will use to store logs for the web app. Make note of this information, because you will need it when sharing the logs later. You can click + Storage Account to create a new storage account or container, or select an existing one.
For Web server logging, select Storage.
Click Storage Settings and select the same storage account and container that you set for the application logging.
Once done, you can share the Azure storage container using a shared access signature (SAS).
A SAS is a URI that grants predefined access to the container, so the logs can be viewed without access to the Azure portal. The time span and permissions allowed for access to a storage resource such as a blob or container can be derived from a stored access policy or specified directly in the URI.
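For example, a read/list SAS for the log container can be generated with the Azure CLI. Account and container names are placeholders; authenticate with --account-key or a logged-in session, and append the printed token to the container URL before sharing it:

# Generate a SAS that allows reading and listing blobs until the expiry date.
az storage container generate-sas \
  --account-name mylogstorage \
  --name weblogs \
  --permissions rl \
  --expiry 2025-12-31T23:59Z \
  --output tsv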
You can refer to the docs below for reference:
https://blogs.msdn.microsoft.com/jpsanders/2017/10/12/easily-create-a-sas-to-download-a-file-from-azure-storage-using-azure-storage-explorer/
https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-9-shared-access-signatures/
https://www.alienvault.com/documentation/usm-anywhere/deployment-guide/azure/azure-enable-diagnostics.htm
Hope it helps.
My pipeline doesn't start due to this error:
(8e45efed0ad51300): Workflow failed. Causes: (8e45efed0ad51e7b): There was a problem refreshing your credentials.
Please check:
1. Dataflow API is enabled for your project.
2. There is a robot service account for your project: [project number]@cloudservices.gserviceaccount.com should have access to your project. If this account does not appear in the permissions tab for your project, contact Dataflow support.
I assume the Dataflow API is enabled, since I'm able to reach the Dataflow monitoring console, so the first requirement is fulfilled. The second isn't: there is not a single account from the domain cloudservices.gserviceaccount.com in my project's permissions tab.
Where can I ask for such help without a paid support plan?
If you disable and then re-enable the Dataflow API for your project that should create the missing service account.
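Assuming the gcloud CLI is available, the toggle is two commands (PROJECT_ID is a placeholder):

# Disabling and re-enabling the API re-provisions its service accounts.
gcloud services disable dataflow.googleapis.com --project PROJECT_ID
gcloud services enable dataflow.googleapis.com --project PROJECT_ID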
Hope that helps!
I'm using Google Cloud Platform. I have a Google App Engine project with its code stored in a git repo in my Google Developer Repository. I then use YouTrack to track bugs and would like to integrate it with my Google Developer Repository VCS. I'm able to use git repos with YouTrack, but it requires an OAuth2 token.
It seems OAuth2 tokens are available for most Google APIs, but I don't know which API should be used by a 3rd-party tool wishing to watch for commits. I assume this is the same problem faced by those wanting to use Jenkins to monitor their developer repo and perform testing and deployments accordingly.
How is this normally done? (i.e., get an OAuth2 token and allow repo access to a 3rd-party tool)
YouTrack only supports direct integration with GitHub/GitLab/Bitbucket, so solving the login issue unfortunately would not help.
A workaround would be to use TeamCity or Upsource as a sort of bridge between YouTrack and your VCS. For more details see https://stackoverflow.com/a/9190486/469159. That answer only mentions TeamCity, since Upsource had not been released at the time.
I've taken over development of a Google Analytics API dashboard for a content management platform and upgraded the code to use OAuth2, as the older OAuth version was recently disabled. The authentication flow and subsequent API calls all work fine on my localhost during development.
The problem is when trying the code from a different domain. Google wants the redirect_uri to be whitelisted through the developer console, and if it isn't there, it throws Error: redirect_uri_mismatch.
As this is a self-hosted (and open source) package that people will be able to install on their own servers, there is no way I can add every possible redirect_uri value to the app key in the developer console.
After a bunch of Googling and trying to understand the docs, I get the impression there are two possible solutions:
1. Instruct users to go to the Google Developer Console and create an app key of their own, before going through the OAuth2 flow within the distributed app to give the code access to their Google Analytics data.
2. Use a redirect_uri value of urn:ietf:wg:oauth:2.0:oob with an Installed App key, instructing people to copy/paste the code back into the self-hosted app after authentication.
Neither of these is really appealing, as each adds a bunch of complexity for the user (though option 2 sounds mostly doable). Are there other options, or am I overlooking something simple?
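For reference, the final step of option 2 is a plain token exchange against Google's token endpoint. A minimal sketch with curl, where all values are placeholders the user would supply:

# Exchange the copy/pasted authorization code for tokens, using the
# out-of-band redirect URI from option 2.
curl -s https://oauth2.googleapis.com/token \
  -d code="$AUTH_CODE" \
  -d client_id="$CLIENT_ID" \
  -d client_secret="$CLIENT_SECRET" \
  -d redirect_uri="urn:ietf:wg:oauth:2.0:oob" \
  -d grant_type=authorization_code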
You actually don't have any choice in this matter: you must go with option 1. When you state this is a dashboard and web application, it leads me to believe it is written in some kind of scripting language, which means the client ID and client secret would be visible to your users/customers. That is against Google's terms of service.
Changes to the Google APIs Terms of Service: asking developers to make reasonable efforts to keep their private keys private and not embed them in open source projects.
You may not release your client ID and client secret to your users; they are going to have to create their own, which nicely solves your redirect URI problem.
Further reading: Can I really not ship open source with Client ID?