How to bring complex Application Insights to a Farmer deployment?

I got interested in Farmer and decided to try it out in my project. I managed to replace most of the ARM template with Farmer.
However, Application Insights remains, as I have quite a complicated setup there, including some alerts, scheduled query rules and so on, whereas everything Farmer currently supports for AI is just the name, IP masking and the sampling percentage.
How can I plug my AI setup into Farmer so that I don't have to reject Farmer just because of that part? My service looks like this:
let webApp = webApp {
name appName
service_plan_name servicePlanName
sku WebApp.Sku.B1
always_on
...
}

So the webApp setup has a builder keyword for this, link_to_unmanaged_app_insights:
Instructs Farmer to link this webapp to an existing app insights instance that is externally managed, rather than creating a new one.
However, there are no examples and only one test using it, so after some experimenting, this is what proved to work:
1. Keep the AI setup in ARM in the source code, e.g. arm-template-ai.json.
2. Note the AI resource Id.
3. Use the aforementioned keyword in the F# app setup:
let webApp = webApp {
name appName
service_plan_name servicePlanName
sku WebApp.Sku.B1
always_on
link_to_unmanaged_app_insights (ResourceId.create appInsightsName)
}
4. In the release pipeline, first deploy AI from the AI ARM template in the source code.
5. Then deploy all other resources from the ARM template generated by Farmer. In Azure DevOps, for example, this can be two consecutive ARM template deployment steps in the release pipeline.
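For step 5, here is a minimal sketch of how the Farmer-generated template can be produced from the webApp definition above; the location and the output file name are just illustrative:
open Farmer
open Farmer.Builders

// Wrap the web app in a deployment; any other Farmer-managed resources go here too.
let deployment = arm {
    location Location.WestEurope
    add_resource webApp
}

// Writes farmer-template.json, which the second pipeline step deploys
// after the externally managed App Insights template.
deployment |> Writer.quickWrite "farmer-template"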

Related

Scenario where I have to create an application that might have more destinations in future

I was just wondering: is it possible to add more destinations/services dynamically in my UI5 application rather than declaring them in the manifest?
For example: let us assume currently I know of a service A that I'm going to use.
So I will add that service, create a model in the manifest and consume it through OData.
But what if in the future the requirement changes, and now they want to be able to select the system as well, so that service A from that particular system is selected and data is fetched from that destination only? (Assuming the service is the same in all systems.)
Services defined in the manifest.json help you during initialization of your app.
It shouldn't be a great problem to define a new model at runtime in the controller, e.g.
var oModel = new sap.ui.model.odata.v2.ODataModel("http://services.odata.org/Northwind/Northwind.svc/");
and set it as the view's model
this.getView().setModel(oModel);
or change the bindingContext of a control to it
oControl.setBindingContext(oContext, "myModelName");
If you encounter CORS problems and have to define destinations in the SAP Cloud Platform cockpit upfront, this might be a bit more difficult. But in your case (the service is the same in all systems) that won't be a problem.
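A rough sketch of that runtime switch, with purely illustrative names (the destination path prefix depends on how your app's routes are configured, e.g. in neo-app.json):
// Pick the service URL based on the system the user selected (illustrative names)
var sSelectedSystem = "SYSTEM_A"; // e.g. taken from a selection control
var sServiceUrl = "/destinations/" + sSelectedSystem + "/sap/opu/odata/sap/ZMY_SERVICE/";
var oModel = new sap.ui.model.odata.v2.ODataModel(sServiceUrl);
this.getView().setModel(oModel, "myModelName");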

Multiple web-interfaces for same neo4j database

Note: I want solutions only for neo4j community edition, not the enterprise one. Thanks!
I want to use the default web interface http://localhost:7474/browser/ for development and read/write purposes. Also, I would like to use another web interface, which I will open to the public for read purposes, on a different port, say 8474.
I tried this:
- Used two instances (Neo4j folders): a) read_only = true, b) read_only commented out.
- Changed the http/https ports for both to differentiate them.
- Changed the org.neo4j.server.database.location property in the 'read_only' one to point to the location of the 'read/write' one.
This doesn't work. Any workaround? I just want two web interfaces for the same database: one read-only, one with read/write support.
Set up a cluster of 3 Neo4j enterprise instances (or 2 instances plus one arbiter) and set read_only=true on one of the instances.
See http://neo4j.com/docs/stable/ha-setup-tutorial.html for detailed setup instructions.

Running 2 instances of Adobe Analytics in parallel with DTM

I'm trying to migrate my Analytics implementation to DTM by building a second instance and having the old and the new one run in parallel for a while before disabling the old one.
When I browse the site with DTM staging mode enabled to test the setup, I get collision issues between the two instances. The alternate tracking code variable I specified for the new version is not being defined. Instead of getting a server call from each instance, I get two calls from the old one, with a mix of variable values from both designs.
All the setup so far has been done through the DTM interface, with no custom code.
Adobe Analytics tool library config:
Code configuration = custom
Code Hosted = In DTM
Tracker Variable Name = s2 (old one uses "s")
The old instance is configured through AEM 5.6.1 with H.25; the new one uses whatever DTM includes by default.
What would be the way to dissociate the 2 instances?

Where best to create company wide groups in OFBiz?

I am trying to create branches of a company (and then hopefully teams within branches) in OFBiz. I had a look at the HR app, and while it does list a company, some departments and other things in the main view, I haven't been able to find a way to modify this org tree to remove from or add to it. The only thing I can think of is to delete/modify this information in the DB, but I'd rather not resort to such tinkering (if it is indeed possible to do it this way).
This can be done with Relationships in the Party Manager application.
(See https://localhost:8443/partymgr/control/EditPartyRelationships?partyId=Company if you're running a local default installation)

How to run a story multiple times with different parameters

I have developed a jBehave story to test some work flow implemented in our system.
Let’s say this story is called customer_registration.story
That story is a starting point of some other more complex work flows that our system supports.
Those more complex work flows are also covered by different stories.
Let’s say we have one of our more complex work flows covered by a customer_login.story
So the customer_login.story will look something like below:
Story: Customer Login
Narrative:
In order to access ABC application
As a registered customer
I want to login into the application
Scenario: Successfully log into the application
GivenStories: customer_registration.story
Given I am at the login page
When I type a valid password
Then I am able to see the application main menu
All works perfectly and I am happy with that.
The first story (customer registration) is something I need to run with different sets of data.
Let’s say our system supports i18n and we need to check that the customer registration story runs OK for all supported languages, say both en-gb and zh-tw.
So I need to implement a multi_language_customer_registration.story that will look something like this:
Story: Multi language customer registration
Narrative:
In order to access ABC application
As a potential customer
I want to register for using the application
Scenario: Successful customer registration using different supported languages
GivenStories: customer_registration.story
Then some clean up step so the customer registration story can run again
Examples:
|language|
|en-gb |
|zh-tw |
Any idea about how I could achieve this?
Note that something like the below is not an option, as I do need to run the clean-up step between runs.
GivenStories: customer_registration.story#{0},customer_registration.story#{1}
Moving the clean-up step inside the customer registration story is not an option either, as then the login story would stop working.
Thanks in advance.
P.S. As you could guess in reality the stories we created are more complex and it is not an easy task to refactor them, but I am happy to do this for a real gain.
First off, BDD is not the same as testing. I wouldn't use it for every single i18n scenario. Instead, isolate the bit which deals with i18n and unit test that, manually test a couple of languages, and call it done. If you really need to be more thorough, use it with a couple of languages, but don't do it with all of them - just enough examples to give you some safety.
Now for the bit with the customers. First of all, are logging in and registration really that interesting? Are you likely to change them once you've got them working? Is there anything special about logging in or registration that's particular to your business? If not, try to keep that stuff out of the scenarios - it'll be more of a pain to maintain than it's worth, and if it's never going to change you can just test it once, manually.
Scenarios which show what the user is logging in for are usually more enticing and interesting to the business (you are having conversations with the business, right?).
Otherwise, here are the three ways in which you can set up a context (Given):
By hacking the data (so accessing the database directly)
Through the UI (or controller if you're automating from that level)
By using existing data.
You can also look to see if data exists, and if it doesn't, set it up. So for instance, if your customer is registered and you don't want him to be registered, you can delete his registration as part of setting up the context (running the Given step); or if you need him to be registered and he isn't, you can go through the UI to register him.
Lastly, JBehave has an @AfterScenario annotation which you can use to denote a clean-up step for that scenario. Steps are reusable - you can call the steps of the scenario from within another step in code, rather than using JBehave's mechanism (this is more maintainable anyway IMO), and this will allow you to avoid clearing registration when you log in.
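A rough sketch of such an @AfterScenario clean-up step, with purely illustrative class, step text and method names:
import org.jbehave.core.annotations.AfterScenario;
import org.jbehave.core.annotations.Given;

public class CustomerRegistrationSteps {

    @Given("the customer is registered")
    public void givenTheCustomerIsRegistered() {
        // register the customer through the UI, or set the data up directly
    }

    @AfterScenario
    public void cleanUpRegistration() {
        // remove the registration created above so the next example
        // (e.g. the next language) starts from a clean state
    }
}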
Hope one of these options works for you!
From a tactical standpoint, I would do this:
In your .story file:
Given I set my language to <language>
When I type a valid password <pass>
Then I am able to see the application main menu
Examples:
|language|pass|
|en-gb |password1|
|zh-tw |kpassword2|
Then in your Java file:
@Given("I set my language to $language")
// method goes here
@When("I type a valid password $pass")
// method goes here
@Then("I am able to see the application main menu")
// method goes here
Most unit testing frameworks support this. Look at how MSTest lets you specify a DataSource; NUnit is similar:
https://github.com/leblancmeneses/RobustHaven.IntegrationTests
Unfortunately, some of the BDD frameworks I've seen try to replace existing unit test frameworks when they should instead work with them and reuse that infrastructure.
https://github.com/leblancmeneses/BddIsTddDoneRight
is a fluent BDD syntax that can be used with MSTest/NUnit and works with existing test runners.
