CQRS, event sourcing and a translated application

I am working on an application (CQRS + event sourcing) that should support multiple user languages. The user will have the ability to translate some of his input into different languages; e.g. some labels or descriptions can be given in Dutch and/or English. Depending on the language preferences of the user, the application should show the correct translation.
I suspect the read model plays a big role in this process.
I was thinking of creating events like ItemDescriptionTranslated, telling 'The description of item X was translated to language Y as Z'.
I would think that the aggregate can safely ignore events of this kind, and that only the read models should do something with this information.
Does this make sense? Does any of you have experience with CQRS/ES in a translated application? Any hints are greatly appreciated.

Of course you can use event sourcing. You can code the aggregate's build functions to ignore the ItemDescriptionTranslated event.
The main question is whether you really need event sourcing in this part of the application. For example, you can build authorization either way - with ES or without. If you want to log all users' logins and authentication attempts, ES is preferable. But if you only need login itself, without any analysis, I suggest not using ES.
So, do you want to collect some additional information about the translating? When, and by whom; maybe check some statistics about corrections by different authors, and so on.
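A minimal sketch of the split described above, in Python with hypothetical class names: the aggregate's event applier ignores ItemDescriptionTranslated (it carries no business rules), while a read-model projection indexes translations per language with a fallback to a default.

```python
# Hypothetical sketch: the write side ignores translation events,
# while a read-model projection stores descriptions per language.

class ItemDescriptionTranslated:
    def __init__(self, item_id, language, text):
        self.item_id = item_id
        self.language = language
        self.text = text

class ItemAggregate:
    """Rebuilds state from events; translation events affect no invariants."""
    def __init__(self):
        self.description = None

    def apply(self, event):
        if isinstance(event, ItemDescriptionTranslated):
            return  # safely ignored on the write side
        # ... handle events that do affect invariants ...

class ItemTranslationsProjection:
    """Read model: description per (item, language), with a fallback."""
    def __init__(self, default_language="en"):
        self.default_language = default_language
        self.translations = {}  # (item_id, language) -> text

    def handle(self, event):
        if isinstance(event, ItemDescriptionTranslated):
            self.translations[(event.item_id, event.language)] = event.text

    def description_for(self, item_id, language):
        # Fall back to the default language when no translation exists.
        return self.translations.get(
            (item_id, language),
            self.translations.get((item_id, self.default_language)))
```

Both consumers are fed the same event stream; the aggregate's state is unchanged by translation events, while the read model can answer per-language queries.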


Is Appium only used for testing? What about an app/bot assistant?

I'm looking for some guidance on what the best approach is and what software I need.
My bf owns a promotional company, books about 15 shows a month and spends a lot of time on his phone: creating event pages, linking up the bands' Facebook and Bandcamp URLs to events, contacting bands with set times and backline info, etc. I thought I would help him out (and other music promoters that I've talked to) by automating some of these tasks. I know there might be other apps out there that already have some of these capabilities, but I wanted to create something myself, as it would be a fun project that I could use to practice my programming skills (beginner-ish).
The app/bot will act as an assistant. It should be able to create an event, ask for the date and let the user input band names. Maybe there will be a series of checkboxes where the user can select whether the band is from out of state (and which state), touring, local, doing an EP/album release, etc., then select a co-host and location. Depending on the number of bands and the location, it will also generate a set-time list and backline info that the user can copy and send to the bands (or maybe it will do this automatically, if it has access to the bands' Instagram or Facebook?). Once the user presses done, it will automatically create an event page with the date and all the other info. The app will automatically populate the event page's description based on what the user selected. For example, if it's 2 out-of-state bands and 2 local bands, it will write something like: "We welcome our 2 touring friends XX from FL? and xxx from NY? With local support from XX and xx" - something along those lines, with a link to their pages.
Maybe in the future the app will also be able to look at the bands' Facebook pages, go to their events, see who pressed going/interested on those events, and then invite those people to their upcoming show?
How do I get started? Should I use Appium and Android Studio to develop this app/bot assistant? Is Appium only used for testing? I have some basic programming experience from college and other courses I took, such as Selenium WebDriver. More importantly, is this doable, or am I setting myself up for failure?
In addition to being a developer, I also am a part-time promoter and perform all of those same tasks that your boyfriend does. If there was some way to automate any of it, I would have done so a long time ago. Unfortunately there is, in my opinion, too much variability between different events, venues, and musical acts.
The biggest time sink for me is tracking down information about each act such as their web site URL, Facebook page URL, Soundcloud profile URL, logo, bio, and photo. (Some of the artists I deal with have a press kit but most don't.) Once the information is compiled, creating the Facebook pages is the easy part.
So, maybe a better use of time would be to create a web form that collects that information from each act and stores it in a folder in a structured way.
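The "web form that stores act info in a structured way" suggestion could start as simple as the following sketch. The field names are illustrative, not a real schema: each submitted act gets its own folder containing an info.json.

```python
import json
from pathlib import Path

def save_act_info(root, act):
    """Store one act's press-kit details in its own folder as JSON.

    `act` is a dict of illustrative fields (name, website, facebook, bio, ...);
    a real form handler would validate these before saving.
    """
    folder = Path(root) / act["name"].replace(" ", "_").lower()
    folder.mkdir(parents=True, exist_ok=True)
    path = folder / "info.json"
    path.write_text(json.dumps(act, indent=2), encoding="utf-8")
    return path
```

A form backend would call this once per submission, giving you a predictable per-act folder (e.g. `acts/the_examples/info.json`) to pull from when building the Facebook event page.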

Check Site URL which fills data in Report Suite in SiteCatalyst (Omniture)

This question may seem odd, but we have a slight mix-up within our Report Suites in Omniture (SiteCatalyst). Multiple Report Suites are generating analytics, and it's hard for us to find out which site URL is contributing to the results.
Hence my question: is there any way we can find out which site is filling data into a certain Report Suite?
Through the following JavaScript, I am able to find out which report suite is being used by a certain site:
javascript:void(window.open("","dp_debugger","width=600,height=600,location=0,menubar=0,status=1,toolbar=0,resizable=1,scrollbars=1").document.write("<script language=\"JavaScript\" id=dbg src=\"https://www.adobetag.com/d1/digitalpulsedebugger/live/DPD.js\"></"+"script>"));
But I am hoping to find the other way around: where a Report Suite gets its data from, within the SiteCatalyst admin.
Any assistance?
Thanks
Adobe Analytics (formerly SiteCatalyst) does not have anything native or built in to globally look at all incoming data and see which page/site is sending data to which report suite. However, you can contact Adobe ClientCare and request raw hit logs for a date range, and you can parse those logs yourself, if you really want.
Alternatively, if you have Data Warehouse access, you can export URLs and domains from there for a given date range. You can only select one report suite at a time, but that's still better than nothing, if you really need the historical data now.
Another alternative: if your sites are NOT currently setting s.pageName, then you may be in some measure of luck for your historical data. The pages report is popped from the s.pageName value; if you do not set that variable, it defaults to the URL of the web page that made the request. So, at a minimum, you will be able to see your URLs in that report right now, and that should help you out. And if you define "site" as equivalent to "domain" (location.hostname), you can also set up a classification level on pages for domain and then use the Classification Rule Builder and a regular expression to pop the classification with the domain, which will give you some aggregated numbers.
Some suggestions moving forward...
A good strategy moving forward is to have all of your sites report to a global report suite. Then, you can have each site also send data to a site-level report suite (warning: make sure you have enough server calls in your contract to cover this, since AA does not have unlimited server calls). Alternatively, you can stick with one global report suite and set up segments for each site. Another alternative is to create a rollup report suite for all data from your other report suites to also go to. Rollup report suites do not have as many features as standard report suites, but for basic things such as pages and page views, they work.
The overall point though is that one way or the other, you should have all of your data go into one report suite as the first step.
Then, you should also assign a few custom variables to be output on the pages of all your sites. These are the 4 main things I always try to include in an implementation to make it easier to find out which sites/pages are reporting to what.
1. A custom variable to identify the site. Some people use s.server for this. However, you may also want to pop a prop or eVar with the value as well, depending on how you'd like to be able to break data down. The big question here is: how do you define "site"? I have seen it defined many different ways.
2. If you do NOT define "site" as the domain (e.g. location.hostname), then I suggest you pop a prop and eVar with the domain, because AA does not have a native report for this. But if you do, then you can skip this, since it's the same thing as point #1.
3. A custom prop and eVar with the report suite(s). Unless you have a super old version of legacy code, just set it with s.sa(). This will ensure you get the final report suite(s), in case you happen to use a version that uses dynamic account variables (e.g. s.dynamicAccountList).
4. If you set s.pageName with a custom value, then I suggest you pop a prop and eVar with the URL. Tip: to save on request URL length to AA, you can use dynamic variable syntax to copy the g parameter already in a given AA request. For example (assuming you don't have code that changes the dynamic variable prefix): s.prop1='D=g'. Or, you can pop this with a processing rule if you have the access.
You can normally find this sort of information in the Site Content -> Servers report. There will be information in there that indicates which sites are sending in the hits. Your mileage may vary based on the actual tagging implementation; it is not common for anyone to explicitly set the server, so the implicit value is the domain the hit is coming from.
Thanks C.

Expanding a website - providing different contents across different places

I am working on a website that was originally targeted at users from a specific geographic region. Now I would like to expand its userbase to another region. The need is to serve different content to different regions with the same base functionality.
My initial thought (I might sound like a noob here) is to host the content specific to different regions on different databases, redirect users to specific domains and thus map the users geographically. Do suggest if it's the right way to proceed.
Also, I would like to know whether there is a need to localize my website for these regions (the current language used is English).
Please post your experiences in such scenarios and also your ideas to bring about the transition.
Thanks in advance.
How do you see users being matched to their specific regional content?
Will they be presented with an option to choose?
Will you use geo functions to determine location?
Will you use server based reverse DNS lookup to determine location?
Will each region get its own "entry" URL (aka different domains)?
The first three are fraught with their own specific problems...
Presenting a choice/menu is considered bad form because it adds to the number of "clicks" necessary for a user to get to the content they actually came for.
While geo functions are very widely supported in all modern browsers, they are still seen as a privacy issue, in that a large number of users will not "allow" the functionality, meaning you'll have to fall back to a choice/menu approach anyway.
Server-based reverse DNS, while a common practice, is very unreliable because many users use VPNs, proxies, TOR, etc. specifically to mask their actual location from this method of lookup.
Personally, my experience is to use completely separate entry URLs that are all hosted as virtual domains on a single Web Server. This gives you a large array of methods of determining which entry URL was used to access your code, and then format/customize the content appropriately.
There is really no need to setup separate servers and/or databases to handle these different domains/regions.
With that said, even if the language is common across regions, it is a very good habit to configure your servers and databases to support UTF-8 end-to-end, such that if any language specific options need to be supported in the future, then you won't need to change your code to do so. This is especially true if your site will capture any user generated input.
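As a rough illustration of the virtual-domain approach above (the domain names are hypothetical), one codebase can key region-specific content off the request's Host header, with no separate servers or databases involved:

```python
# Hypothetical domains; a single web server hosts all of them
# as virtual domains and serves region-specific content.
REGION_BY_HOST = {
    "example.co.uk": "uk",
    "example.com.au": "au",
    "example.com": "us",  # primary/default domain
}

def region_for_host(host, default="us"):
    """Map the Host header of an incoming request to a region code."""
    host = host.split(":")[0].lower()  # strip any :port suffix
    return REGION_BY_HOST.get(host, default)
```

The request handler then formats/customizes content for `region_for_host(request_host)`, falling back to the default region for unrecognized hosts.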

Custom replacement strings from third party app

Is there any ability to populate a learning module's content using data passed from a third party application. For example:
Third party data:
userid = 12, username = Sally, user_q1_answer = 101, user_q2_answer = Jim
Module Content setup:
[[username]], since you are in room [[user_q1_answer]], you should contact [[user_q2_answer]] in the event of the fire alarm going off.
Module Content Delivered:
Sally, since you are in room 101, you should contact Jim in the event of the fire alarm going off.
Thanks for any help
Currently, no facility in the LMS exists to do this kind of dynamic substitution at render time. A number of other questions here have covered this ground. As of Spring 2013, this kind of functionality is on the development roadmap but there is not yet a committed release vehicle for it.
It might be possible to use a client-side browser extension to detect specially formatted strings in page content and make Valence Learning Framework API calls to find values it can replace those strings with. However, this technique would probably, in practice, only be able to replace values that are known about the current user and their relationship to the LMS. Through URL and page content examination, it might also be possible to gather knowledge about the user's current browsing context (i.e. what course or course section they're looking at), but we never recommend screen-scraping, because you can't depend on meaningful tokens or data appearing reliably going forward (whereas you can depend on the Learning Framework APIs to be able to get you information about the current operating user).
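Whatever mechanism supplies the values (LMS-side substitution once it ships, or API calls from an extension), the replacement step itself is straightforward. A sketch of the [[token]] substitution from the question, independent of any LMS API:

```python
import re

def fill_tokens(template, values):
    """Replace [[token]] placeholders with values; unknown tokens are left intact."""
    def repl(match):
        key = match.group(1)
        return str(values[key]) if key in values else match.group(0)
    return re.sub(r"\[\[(\w+)\]\]", repl, template)
```

Leaving unknown tokens untouched (rather than blanking them) makes missing data visible during testing instead of silently producing broken sentences.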

How would I go about routing visitors based on location in rails?

I'm working on a rails app that needs to route users to a specific URL based on their location. Preferably something that will present them the appropriate content based on location with the ability for them to be able to view content for other locations.
Specifically, think of the location interface for Craigslist... Users are presented content from the city they are in and still allowed to select and view another city.
I've seen a few posts that answer parts of this question, but I'm trying to plan out the best solution.
It looks like there is going to need to be something, probably cookie based, that sets a 'default' location for a given user and still allows them to select other locations.
Again, just looking for concept/planning assistance and any direction on any gems that might be applicable.
Thanks in advance!
http://dev.maxmind.com/geoip/geolite is a free GeoIP database that works pretty well. It makes some mistakes (it put a client's office of mine in Kirkland, WA when they are in fact in downtown Seattle, WA), but it's certainly good enough for Craigslist-level specificity, since you'd be routing both of those locations to "seattle" anyway. There's a Ruby gem for it as well - "geoip-c". It's very easy to use.
The other option would be to use HTML5's "gimme your location" functionality. It's more intrusive for the user, but might be more precise.
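Combining this with the cookie-based default from the question, the resolution logic might look like the sketch below (shown in Python for brevity rather than Ruby/Rails; `geoip_lookup` stands in for whatever the geoip-c gem or another database returns, and the city names are hypothetical):

```python
DEFAULT_CITY = "seattle"  # hypothetical fallback city

def city_for_request(cookies, ip, geoip_lookup, chosen=None):
    """Resolve which city's content to serve.

    Precedence: explicit user choice > saved cookie > GeoIP guess > default.
    `geoip_lookup(ip)` may return None when the IP is unknown.
    """
    if chosen:                    # user picked another city from the menu
        cookies["city"] = chosen
        return chosen
    if "city" in cookies:         # returning visitor keeps their default
        return cookies["city"]
    city = geoip_lookup(ip) or DEFAULT_CITY
    cookies["city"] = city        # remember as their default
    return city
```

In a Rails controller this would live in a before_action, with `cookies` being the real cookie jar and the chosen city coming from a route or query parameter.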
