Which method would be best for storing endpoint information in a programmable repository, rather than hard-coding it, when using the BizTalk ESB 2.0 Toolkit?
For this you can use a UDDI portal, and to access the UDDI portal you can configure the UDDI resolver.
In our Stream Analytics job we have some constant values that are required for further computations. Those are considered "secret" by our customer, so it would be good not to have them set directly in the query or the user-defined function we're using. Is there any best practice for dealing with these? E.g., can we somehow retrieve these values from Azure Key Vault?
Unfortunately, Azure Stream Analytics doesn't support Azure Key Vault bindings.
I would suggest voting up these ideas submitted by other Azure customers:
https://feedback.azure.com/forums/270577-stream-analytics/suggestions/35328418-enhance-security-for-asa-managed-services-identit
https://feedback.azure.com/forums/270577-stream-analytics/suggestions/40530247-azure-key-vault-as-reference-data-input
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
You could also look at the Reference Data option in Stream Analytics as a place to store the constants and update them as often as needed. If the concern is about having control over how your queries are encrypted while using ASA, you can use your own storage account to store all private data assets related to the job, and then encrypt that storage account in whatever way you see fit.
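If you go the reference data route, updating the constants is just a matter of writing a blob into the storage account the job reads from. Below is a minimal sketch using the azure-storage-blob Python package; the connection string, container name and blob path are placeholders and must match however your reference data input is configured.

```python
# A minimal sketch, assuming the job has a reference data input backed by a
# container named "reference-data" in your own storage account. The connection
# string, container name and blob path pattern below are placeholders.
import json
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<your-storage-account-connection-string>"

# The "secret" constants the query will join against, kept out of the query text.
constants = [{"key": "threshold", "value": "42"}]

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("reference-data")

# ASA reference data inputs are usually configured with a {date}/{time} path
# pattern; the blob name must match whatever pattern the input uses.
container.upload_blob(
    name="2024-01-01/00-00/constants.json",
    data=json.dumps(constants),
    overwrite=True,
)
```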
I need to publish messages to GCP Pub/Sub with a POST request as the platform I'm using (Zoho) does not allow for any of the GCP libraries. I'm not sure how to make the request in a simple way, as the normal authentication system seems complex.
Is there an easy way to publish a message using, e.g., an API key?
Alternatively is there a simple way to create an API endpoint within GCP that I can then forward data on to the messaging system?
I have used the Python client to publish to Pub/Sub, but cannot make plain POST requests because of the authentication issues.
The answer to both of your questions is yes, and Google Cloud Endpoints is the way to go here.
With Google Cloud Endpoints you can create a custom endpoint and use API keys to authenticate the requests made to it. There's a really good how-to guide on Medium you can follow to set up your endpoint and your Pub/Sub push subscription.
More information about creating push subscriptions can be found in the public documentation.
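For illustration, here is a rough Python sketch of what the call from your side could look like once such an endpoint exists; any HTTP client (including what Zoho offers) can reproduce the same request. The URL, route and payload shape are assumptions about your deployment, not something fixed by Cloud Endpoints itself.

```python
# A rough sketch, assuming you have deployed a Cloud Endpoints service at
# https://your-endpoint.example.com with a /publish route that forwards the
# request body to your Pub/Sub topic, and that the endpoint requires an API
# key. The URL, route and payload shape are assumptions about your deployment.
import requests

API_KEY = "<your-api-key>"
ENDPOINT = "https://your-endpoint.example.com/publish"

response = requests.post(
    ENDPOINT,
    params={"key": API_KEY},  # Cloud Endpoints accepts API keys via the "key" query parameter
    json={"message": "hello from Zoho"},
    timeout=10,
)
response.raise_for_status()
print(response.status_code, response.text)
```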
I am creating an iOS application that needs to interact with a RESTful API (which I am also going to build).
The problem is I have no knowledge in this realm and would like to ask for some help. (I have tried to learn this for a few days, but the more I study, the more confused I get...)
My company has a server running Windows. What is the process for deploying APIs there and also using it for data storage?
My company has a Microsoft 365 license, so I have access to SharePoint. I've read that there are SharePoint APIs, so it would be nice to integrate them with my app. But the more I study, the more I run into Azure AD. Is it something I must have in order to use the SharePoint APIs?
I know this is a very broad question, but I really appreciate anyone who can help. Thank you.
Yes, you will need Azure AD to interact with the SharePoint API. You use OAuth authentication by registering the app in Azure AD and granting it the SharePoint Online permissions required for the operations you want to perform.
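As a rough sketch of what that looks like in practice, the snippet below acquires a token with the client credentials flow and then reaches SharePoint through Microsoft Graph (rather than the classic SharePoint REST API; the Azure AD registration step is the same either way). The tenant ID, client ID, secret and the Sites.Read.All permission are assumptions about your app registration.

```python
# A minimal sketch, assuming an app registration in Azure AD with a client
# secret and an admin-consented application permission such as Sites.Read.All.
# The tenant ID, client ID and secret are placeholders.
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"

# 1. Get an access token from Azure AD using the client credentials flow.
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://graph.microsoft.com/.default",
    },
    timeout=10,
)
access_token = token_response.json()["access_token"]

# 2. Call SharePoint through Microsoft Graph with the bearer token.
site = requests.get(
    "https://graph.microsoft.com/v1.0/sites/root",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
print(site.json().get("displayName"))
```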
Can I use D2L as a Tool Provider as well as a Tool Consumer? Basically, if I want to access some learning objects from one D2L instance in another D2L instance, is that possible?
Not currently, no: D2L cannot act as an LTI Tool Provider. You may be able to achieve what you describe with a service app running between the two D2L instances, using the Valence APIs to read content objects from one LMS and write them to the other; however, in that case the service would need user credentials for a user in both instances.
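To make the shape of such a middle service concrete, here is a rough Python sketch. The sign_url() helper is hypothetical, the API version, host names and org unit IDs are placeholders, and the content routes should be verified against the current Valence documentation.

```python
# A rough sketch of the "service in the middle" approach. The sign_url()
# helper is hypothetical and stands in for Valence's ID/key request signing
# (done with the service's app keys plus user credentials on each instance).
# The API version, host names and org unit IDs are placeholders, and the
# content routes should be verified against the current Valence docs.
import requests

LE_VERSION = "1.34"                     # placeholder Learning Environment API version
SOURCE_LMS = "https://source.example.com"
TARGET_LMS = "https://target.example.com"
SOURCE_ORG_UNIT = 12345                 # course org unit id on the source instance
TARGET_ORG_UNIT = 67890                 # course org unit id on the target instance

def sign_url(host, method, path):
    """Hypothetical helper: returns the full signed URL with Valence auth query parameters."""
    raise NotImplementedError

# Read the root content modules from the source instance...
modules = requests.get(
    sign_url(SOURCE_LMS, "GET", f"/d2l/api/le/{LE_VERSION}/{SOURCE_ORG_UNIT}/content/root/"),
    timeout=10,
).json()

# ...and create matching modules on the target instance.
for module in modules:
    requests.post(
        sign_url(TARGET_LMS, "POST", f"/d2l/api/le/{LE_VERSION}/{TARGET_ORG_UNIT}/content/root/"),
        json=module,
        timeout=10,
    )
```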
I am starting a new product that will require a .NET-based server (using WCF) hosted on Azure. I would like to have basic authentication and security features. The clients are all "rich" UI but are not necessarily Microsoft ones.
We intend to have the first client application written in Silverlight, but we want to keep our options open to implement clients for iOS and Android in the future. So we do not want to use WCF-specific features, but rather protocols that are easily available on other environments.
Of course, with the Silverlight client, we hope to get as much done for us automatically as possible. We intend to only communicate through web services.
Which bindings are recommended for such a scenario?
How would you implement security? (Assuming we need basic security: users being able to log in with encrypted username and password, and perhaps some built-in basic role management, although this is optional.)
Suggestions?
You could use WCF to implement a REST interface.
The binding would have to be one that is open to all platforms (basicHttpBinding, or webHttpBinding for a REST-style interface), using SSL to secure the connection.
Credentials could be managed with tokens passed back and forth after authentication, much like an HTTP session. You could pass the token in a cookie, but it could also be part of the API or the headers. See Best Practices for securing a REST API / web service.
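To make the token flow concrete, here is a minimal client-side sketch in Python; the /login and /orders routes and the response shape are hypothetical, and the same pattern can be reproduced from Silverlight, iOS or Android HTTP stacks.

```python
# A minimal client-side sketch of the token pattern, assuming hypothetical
# /login and /orders routes and a JSON login response containing a "token"
# field. Always call the service over HTTPS so credentials and tokens are
# encrypted in transit.
import requests

BASE_URL = "https://your-service.cloudapp.net"

# 1. Authenticate once; the server hands back a token instead of keeping a session.
login = requests.post(
    f"{BASE_URL}/login",
    json={"username": "alice", "password": "secret"},
    timeout=10,
)
token = login.json()["token"]

# 2. Send the token with every subsequent call, here as an Authorization header.
orders = requests.get(
    f"{BASE_URL}/orders",
    headers={"Authorization": f"Bearer {token}"},
    timeout=10,
)
print(orders.json())
```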
This would grant you the power of .NET and WCF without losing interoperability.