Terraform code for "backup_blob_container_uri" - terraform-provider-azure

resource "azurerm_analysis_services_server" "server" {
name = "analysisservicesserver"
location = "northeurope"
resource_group_name = azurerm_resource_group.rg.name
sku = "S0"
admin_users = ["myuser#domain.tld"]
enable_power_bi_service = true
backup_blob_container_uri = ("https://${STORAGE ACCOUNT NAME}.blob.core.windows.net/${CONTAINER NAME}%s", Blob SAS TOKEN)
* The storage firewall is disabled.
* Original error: Code="BadRequest" Message="Invalid backup blob container 'The remote server returned an error: (403) Forbidden.'. Azure blob storage documentation can be found here: https://go.microsoft.com/fwlink/?linkid=2106906"
* I am able to add the same container via the portal without any error.
* I also tried copying and pasting the "Blob SAS URL" directly from the storage account, but I still get the same error.

I tested this in my environment and get the same error, even after changing the provider version. The azurerm_analysis_services_server resource might be outdated with respect to backup_blob_container_uri.
You can refer to this GitHub discussion, where the same error was reported a while back, and follow up with them; a new release may be needed to sort this out.
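Before pinning this on the provider, it can also help to confirm that the SAS token itself works against the container. Below is a minimal diagnostic sketch (not part of the Terraform config; it uses Python's requests, and the account, container, and token values are placeholders you would substitute) that calls the blob service's List Blobs operation with only the SAS token:

# Quick check that the SAS token grants access to the backup container.
# The three values below are placeholders -- substitute your own.
import requests

account   = "<storage account name>"
container = "<container name>"
sas_token = "<blob sas token>"  # without the leading '?'

# List Blobs, authorized only by the SAS token
url = (f"https://{account}.blob.core.windows.net/{container}"
       f"?restype=container&comp=list&{sas_token}")
print(requests.get(url).status_code)  # 200 means the SAS works; 403 points at the token or a firewall rule

If this returns 200 but Analysis Services still reports 403, the problem is more likely on the service/provider side than in the SAS itself.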

Related

SYSTEM$VERIFY_EXTERNAL_OAUTH_TOKEN failed with error EXTERNAL_OAUTH_JWS_CANT_RETRIEVE_PUBLIC_KEY

The code below fails with the error: Token Validation finished.{"Validation Result":"Failed","Failure Reason":"EXTERNAL_OAUTH_JWS_CANT_RETRIEVE_PUBLIC_KEY"}
SELECT SYSTEM$VERIFY_EXTERNAL_OAUTH_TOKEN('ey...')
Security integration:
create or replace security integration external_oauth_azure
type = external_oauth
enabled = true
external_oauth_type = azure
external_oauth_issuer = 'https://sts.windows.net/xxxxx/'
external_oauth_jws_keys_url = 'https://login.microsoftonline.com/xxxxx/discovery/v2.0/keys'
external_oauth_audience_list = ('https://xxxx.ap-southeast-1.snowflakecomputing.com')
external_oauth_token_user_mapping_claim = 'upn'
external_oauth_snowflake_user_mapping_attribute = 'login_name'
external_oauth_any_role_mode = 'ENABLE';
This issue can happen only if the external_oauth_jws_keys_url value is incorrect.
Are you sure the URL you used is correct? It should come from
Azure -> App Registration -> Endpoints -> "OpenID Connect metadata document" URL: paste that URL into a browser.
From the returned metadata, copy the jwks_uri value and use it when creating the security integration.
Please re-verify that the value you used is correct and compare it once again.
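If you want to script that lookup, here is a small sketch (assuming Python's requests; the tenant ID is a placeholder) that pulls the OpenID Connect metadata document and prints the jwks_uri to use for external_oauth_jws_keys_url:

# Fetch the Azure AD OpenID Connect metadata and print its jwks_uri.
# The tenant ID is a placeholder -- use your own tenant.
import requests

tenant_id = "xxxxx"
metadata_url = (f"https://login.microsoftonline.com/{tenant_id}"
                "/v2.0/.well-known/openid-configuration")

metadata = requests.get(metadata_url).json()
print(metadata["jwks_uri"])  # use this value for external_oauth_jws_keys_url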

Does not have storage.buckets.list access to the Google Cloud project?

I wrote a Ruby script that uploads an audio file to Google Cloud Storage.
require "google/cloud/storage"

def upload_to_Google_cloud(audio_file)
  project_id = "<project_id>"
  key_file   = "<json_file>"
  storage    = Google::Cloud::Storage.new project: project_id, keyfile: key_file

  # Picking the first bucket requires the storage.buckets.list permission
  bucket_name = storage.buckets.first.name
  puts bucket_name
  bucket = storage.bucket bucket_name

  local_file_path = "/path/#{audio_file}"
  file = bucket.create_file local_file_path, "#{audio_file}.flac"
  return "Uploaded #{file.name}"
end
However, every time I run the command ruby video_dictation.rb, it returns the error xxxxxxxxx does not have storage.buckets.list access to the Google Cloud project. (Google::Cloud::PermissionDeniedError).
Any help or suggestions? Thanks!
It should be a permission issue.
1. Create a service account. It will look like "my-storage-bucket@yourprojectname.iam.gserviceaccount.com".
2. Go to IAM & Admin -> Permissions.
3. Assign the "Storage Object Admin" role to that service account.
4. Try your code again. If it works, scope the permissions down based on your needs.
5. Remember to download the JSON key file for that particular service account.
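As a quick cross-check of step 5 (a sketch in Python with the google-cloud-storage client rather than Ruby; the key path and project ID are placeholders), you can confirm the downloaded key really has storage.buckets.list before wiring it back into the script:

# Verify the service-account key can list buckets -- the exact call that was failing.
from google.cloud import storage

client = storage.Client.from_service_account_json("key.json", project="<project_id>")
for bucket in client.list_buckets():  # raises PermissionDenied if the role is missing
    print(bucket.name)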

How to get file content on O365 sharepoint folder using graph API

Using Python with the adal and requests packages, I'm attempting to use the MS Graph API to find files on SharePoint (given a SharePoint site name, the folder the file is expected to be in, and the file name).
Using various calls I can manage to do the following:
1. Get an authentication token (using user auth to an app that has full permission to use user credentials and do all read/write on the files being accessed)
2. Establish a valid session
3. Search sites and obtain details on my current site
microsoft_info = SESSION.get('https://graph.microsoft.com/v1.0/sites?search=nameOfSite').json()
4. Obtain drive information associated with the site
for site in microsoft_info['value']:
    if site['displayName'] == siteDisplayNameInput:
        siteId = site['id']
        drives = SESSION.get("https://graph.microsoft.com/v1.0/sites/" + siteId + "/drives")
        drives = drives.json()
5. Obtain file information from the drive of interest
for drive in drives['value']:
    if drive['name'] == folderNameInput:
        driveId = drive['id']
        files = SESSION.get("https://graph.microsoft.com/v1.0/drives/" + driveId + "/root/search(q='')")
        files = files.json()
And then at point 6 (getting the file content) everything falls apart: I get 404 errors saying the resource is not found, despite using the identifiers provided by the API, which clearly indicate the presence of a resource.
for file in files['value']:
    if file['name'] == 'Pipeline Pilot Forms.pptx':
        print("List of properties on file")
        for x in file:
            print(x + " " + str(file[x]))
        fileId = file['id']
        print(fileId)
        callToDLFile = SESSION.get("https://graph.microsoft.com/v1.0/drives/" + driveId + "/items/" + fileId + "/content")
This appears to be the code that should work, but it returns 404 errors. Any help would be greatly appreciated; in a reasonably lengthy search I haven't found anything that matches this issue exactly.
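For reference, here is a hedged sketch of that last step (it reuses SESSION and driveId from the code above, and the file name from the question) that resolves the item by its path under the drive root and then requests its content:

# Resolve the item by path, then download its content.
# Assumes SESSION is the authenticated requests session and driveId is set as above.
item = SESSION.get(
    "https://graph.microsoft.com/v1.0/drives/" + driveId
    + "/root:/Pipeline Pilot Forms.pptx"        # path-based addressing under the drive root
).json()

content = SESSION.get(
    "https://graph.microsoft.com/v1.0/drives/" + driveId
    + "/items/" + item["id"] + "/content"       # /content returns the file bytes (via a redirect)
)
with open("Pipeline Pilot Forms.pptx", "wb") as f:
    f.write(content.content)

If the path-based lookup also returns 404, it is worth double-checking that driveId is really a drive (each document library is its own drive): the code above selects a drive whose name matches a folder name, which can pick the wrong drive if the folder actually sits inside a document library.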

Setting GOOGLE_APPLICATION_CREDENTIALS for an MVC site hosted on Azure

Title says it all pretty much.
I tried uploading the JSON file to Azure Storage and referenced its URL when setting the GOOGLE_APPLICATION_CREDENTIALS environment variable under App settings, but when remotely debugging the site, the URL/directory was apparently not in an acceptable format. I can't store the JSON file locally either, because the website doesn't know anything about my C: drive directories.
Where should I store this file so that I can point the GOOGLE_APPLICATION_CREDENTIALS environment variable for my Azure site at it?
The ToChannelCredentials() approach does not seem to work anymore, so I came up with another solution that works on Azure. I create a text file in the /bin folder of my Azure server with the credentials and then point the environment variable to this file. The Google Cloud API will use it as the default credentials.
string json = #"{
'type': 'service_account',
'project_id': 'xxx',
'private_key_id': 'xx',
'private_key': 'xxx',
...
}"; // this is the content of the json-credentials file from Google
// Create text file in projects bin-folder
var binDirectory = Path.GetDirectoryName(Assembly.GetCallingAssembly().CodeBase);
string fullPath = Path.Combine(binDirectory, "credentials.json").Replace("file:\\","");
using (StreamWriter outputFile = new StreamWriter(fullPath, false)) {
outputFile.WriteLine(json);
}
// Set environment variabel to the full file path
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", fullPath);
// Now you can call the service and it will pick up your credentials
TranslationServiceClient client = TranslationServiceClient.Create();
If anyone is wondering how to handle Google's credentials smoothly in .NET applications, instead of the somewhat awkward approach of writing a file on the server, this is how I solved it for the Translation service. Other services should follow the same principle:
Store the content of the Google credentials JSON file as an environment variable in settings.json / the Azure configuration for your app (using ' ' instead of " " for the inner text):
"GOOGLE_APPLICATION_CREDENTIALS": "{'type': 'service_account','project_id': ...}"
create and return the client:
var credential = GoogleCredential.FromJson(Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"));
var channelCredentials = credential.ToChannelCredentials();
var channel = new Channel(TranslationServiceClient.DefaultEndpoint.ToString(), channelCredentials);
return TranslationServiceClient.Create(channel);
It took me a while to figure it out. Hope it helps.
I use the .json file in my local environment (because of the environment-variable length limit in Windows), and on Azure I use an "Application setting" to set an environment variable. This code handles both cases:
string? json;
var filename = Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS");
if (filename != null)
{
    json = System.IO.File.ReadAllText(filename);
}
else
{
    json = Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS_STRING");
    if (json == null)
    {
        throw new Exception(
            "GOOGLE_APPLICATION_CREDENTIALS_STRING environment variable with JSON is not set");
    }
}
var credential = GoogleCredential.FromJson(json).ToChannelCredentials();
var grpcChannel = new Channel("firestore.googleapis.com", credential);
var grpcClient = new Firestore.FirestoreClient(grpcChannel);
var firestoreClient = new FirestoreClientImpl(grpcClient, FirestoreSettings.GetDefault());
return await FirestoreDb.CreateAsync(FirebaseProjectId, firestoreClient);
I was looking for how to set "GOOGLE_APPLICATION_CREDENTIALS" in Azure App Service. The answers here didn't help me. My solution is very simple, without any code change.
1. In the configuration of the App Service, go to Path mappings.
2. Add a new Azure Storage mount, e.g. /mounts/config.
3. Add the credentials.json file to the file share.
4. In the application settings, add GOOGLE_APPLICATION_CREDENTIALS and set the value to /mounts/config/credentials.json.
That is all.
In the Azure app on the Azure portal, go to Application settings and add the credentials under the Application settings tab.
Then you can reference them in your code as if they were in your web.config file.

Error 1001 in my Windows Service app setup project

I'm trying to install my Windows Service app using a Visual Studio 2008 Setup and Deployment project. I have created a user account that has the "Log on as a service" right, set ServiceProcessInstaller.Account to System.ServiceProcess.ServiceAccount.User, and set the username and password to the user I created previously.
spInstaller.Account = System.ServiceProcess.ServiceAccount.User;
spInstaller.Username = "USER NAME";
spInstaller.Password = "PASSWORD";
sInstaller.ServiceName = "SERVICE NAME";
sInstaller.StartType = System.ServiceProcess.ServiceStartMode.Automatic;
But during the setup process, I get the following error:
Error: 1001. The account name is invalid or does not exist, or the password is invalid for the account name specified
Any ideas why I get this error and how I can fix it?
Thanks.
Are you specifying a domain for your username? I.e., if your machine is called FASTCAR, have you tried:
spInstaller.Username = "FASTCAR\\UserName";
