My pipeline doesn't start due to this error:
(8e45efed0ad51300): Workflow failed. Causes: (8e45efed0ad51e7b): There was a problem refreshing your credentials.
Please check:
1. Dataflow API is enabled for your project.
2. There is a robot service account for your project: [project number]@cloudservices.gserviceaccount.com should have access to your project. If this account does not appear in the permissions tab for your project, contact Dataflow support.
I assume the Dataflow API is enabled, as I'm able to reach the Dataflow monitoring console, so the first requirement is fulfilled. The second isn't: there is not a single account in the cloudservices.gserviceaccount.com domain.
Where can I ask for help with this without a paid support plan?
If you disable and then re-enable the Dataflow API for your project, that should recreate the missing service account.
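If you prefer the command line, the disable/re-enable cycle (and a check that the robot account reappears) can be sketched with the gcloud CLI; YOUR_PROJECT_ID is a placeholder, and exact flag behavior may vary between gcloud versions:

```shell
# Disable, then re-enable, the Dataflow API; re-enabling should
# recreate the missing Google-managed (robot) service account.
gcloud services disable dataflow.googleapis.com --project=YOUR_PROJECT_ID
gcloud services enable dataflow.googleapis.com --project=YOUR_PROJECT_ID

# Check whether the cloudservices robot account now shows up in IAM.
gcloud projects get-iam-policy YOUR_PROJECT_ID \
    --flatten="bindings[].members" \
    --format="value(bindings.members)" | grep cloudservices.gserviceaccount.com
```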
Hope that helps!
I'm using the very interesting IoT Edge module Metrics-Collector, and it's working fine with the Log Analytics workspace. My question is probably more about how the Metrics-Collector module authenticates against the Log Analytics workspace.
For option 1 in this documentation, we need to use the Workspace ID and key here. The possible security issue I see is that the same key would be used on all Edge devices, and if the key is compromised, malicious logs could be injected into the monitoring data.
My question is whether there is an option to create some sort of SAS token to connect to the Log Analytics workspace for "inserting" the data. Or is the solution to use option 2, or to rotate the Log Analytics workspace key and distribute it to the edge devices?
Thanks & regards,
Gerhard
I want to create my first project in Watson Studio; however, I cannot do it because when I click the "Create" button I receive the following message: "Unable to configure credentials for your project in the selected Cloud Object Storage instance". Does anyone know what this means?
Thanks,
You need to be an IBM Cloud account Administrator or Owner if you are using a shared account within your organization.
If you are not an account Administrator, ask the account administrator to enable storage delegation for the Cloud Object Storage service used for the project, so that non-administrator users under that account can create projects using that Cloud Object Storage service.
Please check these documentation links:
https://dataplatform.cloud.ibm.com/docs/content/wsj/getting-started/projects.html?audience=wdp
(See Requirements section).
https://dataplatform.cloud.ibm.com/docs/content/wsj/console/wdp_admin_cos.html?audience=wdp
There is also a chance that this is a temporary issue caused by an intermittent problem with the IBM Cloud IAM service, since project creation attempts to create credentials for the storage service (IBM Cloud Object Storage) you select, so even the account owner may run into this issue. In that case, please reach out to IBM Support.
I've created an Azure DevOps pipeline for building a React Native mobile application for iOS. I have a service connection to Apple's App Store Connect and it uploads the build to TestFlight as expected. I am also passing Release Notes to the AppStoreRelease task and this requires FastLane credentials to get around the Two Factor Authentication.
I tried the approach of creating a fastlane session ID locally and putting that onto the service connection as per https://github.com/fastlane/fastlane/blob/master/spaceship/README.md#support-for-ci-machines. This approach works fine but the session times out and it has to be done again which isn't a sustainable approach.
I've seen documentation on using App Store Connect keys to authenticate (https://docs.fastlane.tools/app-store-connect-api/), but haven't seen any documentation on how to do this for Azure DevOps. Has anyone done this who can provide documentation or a pointer in the right direction?
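For what it's worth, fastlane itself accepts an App Store Connect API key via a JSON key file; the sketch below assumes you store the key as a secure file or secret variable in Azure DevOps and call fastlane from a script step (file names and the `--ipa` path are placeholders):

```shell
# api_key.json: App Store Connect API key in the format fastlane expects
# (key_id, issuer_id, and the content of the .p8 private key).
# Keep this out of source control; inject it from a pipeline secret.
cat > api_key.json <<'EOF'
{
  "key_id": "YOUR_KEY_ID",
  "issuer_id": "YOUR_ISSUER_ID",
  "key": "-----BEGIN PRIVATE KEY-----\nYOUR_P8_KEY_CONTENT\n-----END PRIVATE KEY-----"
}
EOF

# Upload to TestFlight with the API key -- no session, no 2FA prompt.
fastlane pilot upload --api_key_path api_key.json --ipa build/MyApp.ipa
```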
I have my .NET apps hosted in Azure Web Apps. Is there any way I can stream/view the application/server log traces directly without signing into the portal?
The reason I need this is that my team includes fellow developers who will not have access to the Azure portal.
Please help if there is any solution for this. Thanks in advance.
I have tried enabling log streaming in the Azure portal, but that doesn't meet my requirement.
I also tried storing the logs in an Azure storage account, but I cannot find any open-source tools to fetch and read the logs, and it feels like a time-consuming solution.
Mohit's recommendations are great and probably the best advice, however:
I have a suggestion which does not fulfil the requirement of avoiding a role in Azure, but it may offer enough of an advantage to be worth it. Using the Azure CLI, you can stream the logs:
az webapp log tail --name appname --resource-group myResourceGroup
https://learn.microsoft.com/en-us/azure/app-service/troubleshoot-diagnostic-logs#streamlogs
You may be able to set up a role with sufficient constraints that all the developer can do is read the diagnostic logs:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/roles-permissions-security
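As a sketch of that constrained-role idea, you could scope the built-in Reader role to just the one web app (the assignee, subscription ID, and resource names below are placeholders; log streaming may additionally require access to the site's publishing credentials, so verify the exact permissions needed):

```shell
# Give a developer read-only access to a single web app only,
# rather than to the whole subscription or resource group.
az role assignment create \
    --assignee developer@example.com \
    --role "Reader" \
    --scope "/subscriptions/SUB_ID/resourceGroups/myResourceGroup/providers/Microsoft.Web/sites/appname"
```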
Also, if you are not familiar with it, I'd suggest looking at Azure Application Insights. It does not capture the lowest-level logs, but it is likely sufficient for diagnosing the issues a developer would typically run into, and it has many advanced features that make diagnosis far easier than digging through log files.
https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview
The simplest way to achieve this is to use a storage account and container for application and diagnostic logging.
To enable diagnostics for your Azure web app, you can do the following:
Log in to your account at https://portal.azure.com/.
Go to your Azure Web App and select Settings > Diagnostics logs.
For Application Logging (Blob), click On and set the parameters.
Set the Level for the logging.
For Storage Settings, click > and select the Storage Account and Container.
This is the Storage Account and Container that Azure will use to store logs for the Web App. Make note of this information because you will need it to set up a log collection job in USM Anywhere. You can click + Storage Account to create a new storage account or container, or select an existing one.
For Web server logging, select Storage.
Click Storage Settings and select the same storage account and container that you set for the application logging.
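The steps above can also be approximated from the Azure CLI; this is a sketch assuming a recent CLI version (option values differ between versions, and the blob storage account/container binding itself may still need to be completed in the portal):

```shell
# Turn on application logging to blob storage at Information level,
# and enable web server logging (filesystem is the target the CLI offers).
az webapp log config \
    --name appname \
    --resource-group myResourceGroup \
    --application-logging azureblobstorage \
    --level information \
    --web-server-logging filesystem
```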
Once done, you can share the Azure storage container using a SAS (shared access signature).
The SAS is a URI that grants predefined access to the container; this way you will be able to see the logs without accessing the Azure portal.
A shared access signature is a URI that lets you specify the time span and permissions allowed for access to a storage resource such as a blob or container. The time span and permissions can be derived from a stored access policy or specified directly in the URI.
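Generating a read-only SAS for the log container can be sketched with the Azure CLI (the account and container names are placeholders; the GNU `date` invocation computes a 30-day expiry and will differ on macOS):

```shell
# Read + list permissions only, expiring in 30 days (GNU date syntax).
expiry=$(date -u -d "30 days" '+%Y-%m-%dT%H:%MZ')

az storage container generate-sas \
    --account-name mystorageaccount \
    --name logs \
    --permissions rl \
    --expiry "$expiry" \
    --output tsv
# Append the returned token to the container URL, e.g.
# https://mystorageaccount.blob.core.windows.net/logs?<sas-token>
```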
You can refer to the docs below for more details.
https://blogs.msdn.microsoft.com/jpsanders/2017/10/12/easily-create-a-sas-to-download-a-file-from-azure-storage-using-azure-storage-explorer/
https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-9-shared-access-signatures/
https://www.alienvault.com/documentation/usm-anywhere/deployment-guide/azure/azure-enable-diagnostics.htm
Hope it helps.
I'm trying to set up the TFS 2018 Search service, but it reports a problem in the configuration wizard.
I tried changing the user account to enable basic authentication for the Search service and the service account, but it always shows the same error.
Any suggestions?
Solved:
I needed to use Administrator as the search service user, while still using the Network Service account for the service itself.
In this scenario, the Elasticsearch service runs as Network Service.