How to use Google Analytics data to create a dynamic recommendation engine - machine-learning

I am building a recommendation engine and I want it to be more dynamic. I want to use Google Analytics data along with the data I already have to give users good recommendations by capturing users' details automatically. Is there a particular method for doing this, or for blending Google Analytics data with local data so that my recommendation engine works more dynamically?

You can collect the search terms used in your site's internal search engine (if they are defined in the URL, i.e. in the query string, you can specify the parameter and collect them in the dedicated Site Search report; if they are part of the path instead, you can use a filter to extract them).
You can then add them manually or dynamically to your database to offer better search results.
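If you want to pull those collected terms back out of Google Analytics programmatically and blend them with your local data, one option is the Analytics Reporting API. Below is a minimal sketch, assuming a Universal Analytics view with Site Search enabled and an OAuth access token you have already obtained; the view ID and token values are placeholders:

// Sketch: fetch top internal search terms from the GA Reporting API v4.
// VIEW_ID and ACCESS_TOKEN are placeholders you must supply yourself.
const VIEW_ID = "XXXXXXXX";
const ACCESS_TOKEN = "ya29.placeholder";

async function topSearchTerms(): Promise<{ term: string; count: number }[]> {
  const res = await fetch("https://analyticsreporting.googleapis.com/v4/reports:batchGet", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      reportRequests: [{
        viewId: VIEW_ID,
        dateRanges: [{ startDate: "30daysAgo", endDate: "today" }],
        dimensions: [{ name: "ga:searchKeyword" }],
        metrics: [{ expression: "ga:searchUniques" }],
        orderBys: [{ fieldName: "ga:searchUniques", sortOrder: "DESCENDING" }],
      }],
    }),
  });
  const data = await res.json();
  const rows = data.reports?.[0]?.data?.rows ?? [];
  // Each row pairs a search term with how many unique searches used it.
  return rows.map((r: any) => ({
    term: r.dimensions[0],
    count: Number(r.metrics[0].values[0]),
  }));
}

You could then join these terms against your own catalogue to weight recommendations toward what users are actually searching for.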

Related

How to extract conversion results based on different attribution models from the Google AdWords API

I need to extract data from the AdWords API to allow me to see the impact of different attribution models on conversion. I'm visualising my AdWords data in the Klipfolio tool. My AdWords account is set up to use the position-based model, but I'd also like to show last-click conversions. This is essentially the data which is available in the AdWords manager interface. I'm making my API calls via the Supermetrics tool but can also create them directly in Klipfolio.
I'm not entirely familiar with the AdWords Query Language, but it does look like there is a field for attribution type.
I've got as far as extracting data via this, using the Klipfolio interface:
SELECT
Date,Conversions,ConversionTypeName
FROM CAMPAIGN_PERFORMANCE_REPORT
DURING {date.add(-7).format('yyyyMMdd')},{date.today.format('yyyyMMdd')}
What I'm looking for, exactly, is a WHERE clause in which I can specify an attribution model different from the current one.
I cannot obtain this data from our Google Analytics data, as our conversion data from GA is, unfortunately, duplicated. We're using the AdWords conversion tags' de-duplication functions to get around this issue.
I'd be very grateful if anyone could share an example of how an API request could look with the attribution model field present or, indeed, give some feedback on whether this is even possible.
Hi, you can find more information about the CAMPAIGN_PERFORMANCE_REPORT here.
There you can find the SELECT and WHERE fields, used like this:
SELECT CampaignName, Clicks, Impressions, Cost
FROM CAMPAIGN_PERFORMANCE_REPORT
WHERE Impressions < 10
DURING LAST_30_DAYS
Usually the reason for the duplication is the selection of SELECT parameters, but I cannot say this for sure without more information.
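On the attribution model question itself: as far as I know, AWQL has no WHERE predicate that switches the attribution model per query. What I believe you can do instead is select the CurrentModelAttributedConversions column alongside Conversions; the former restates history under the attribution model currently selected in the account, while the latter reflects the model in effect when each conversion was recorded. Please verify that field against the report reference for your API version, as this is an assumption on my part, and note that it still won't pull an arbitrary third model such as last click on demand:
SELECT Date, ConversionTypeName, Conversions, CurrentModelAttributedConversions
FROM CAMPAIGN_PERFORMANCE_REPORT
DURING LAST_30_DAYS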

On Blackboard, is it possible to open multiple SCORMs?

I'm having difficulty finding a way to open multiple SCORMs simultaneously.
Is there a setting that I need to change, or does Blackboard not allow multiple SCORMs to be opened?
So commonly with SCORM you have a Runtime API that is nested somewhere on the LMS web page. This API is directly flavored for the SCO that was requested, so having multiple SCOs trying to talk to the same API would lead to data corruption. The only way to extend the capability to multiple shareable content objects would be to use more IFRAMEs. You'd need to nest the runtime in the first IFRAME, then the content in a sub-IFRAME. I've accomplished this with another client, and it was done mainly to represent single-page SCOs stacked on top of each other as a Q-and-A-style homework display.
Is it optimal? Probably not. Does it work? Totally. The SCORM Runtime is a JavaScript-based API that is populated with a CMI object. The content gets/sets namespaces within that CMI object, but the LMS is responsible for populating its base values as well as maintaining what's allowed and what isn't. Hope that helps; based on your question, though, Blackboard would have to implement that sort of module support to do it.
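To illustrate why the nesting works: a SCO locates the runtime by climbing its parent frames until it finds the API object, so each content IFRAME simply discovers the runtime in its own wrapping IFRAME. A rough sketch of that standard discovery walk, using SCORM 1.2 naming (SCORM 2004 exposes API_1484_11 instead):

// Sketch of the standard SCORM 1.2 API discovery walk a SCO performs.
// Each nested content IFRAME climbs its parent chain and stops at the
// nearest ancestor window that exposes `API`, i.e. its own runtime frame.
function findAPI(win: Window): any | null {
  let current: Window = win;
  let hops = 0;
  while (!(current as any).API && current.parent !== current && hops < 10) {
    current = current.parent; // climb one frame, with a defensive hop cap
    hops++;
  }
  return (current as any).API ?? null;
}

// Inside a SCO: initialize the session before any get/set on the CMI data.
const api = findAPI(window);
if (api) {
  api.LMSInitialize("");
  const status = api.LMSGetValue("cmi.core.lesson_status");
}

Because the walk stops at the nearest runtime, stacking one runtime-plus-content IFRAME pair per SCO keeps their CMI data from colliding.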

Can I use Adobe's (Satellite) Dynamic Tag Manager to set basic Adobe Analytics (Omniture) vars?

Is it possible to use Adobe DTM (aka Satellite Tag Manager) to set basic Omniture/Adobe Analytics variables, like pageName, so we don't need to place this code on the page?
You will need to load the "Analytics" tool into DTM to begin setting up Adobe Analytics variables. The nice thing about DTM is that you do not need to migrate all of your analytics code to DTM at once; you can migrate in steps, as long as you tell DTM that analytics code already exists on the page.
There are great videos here (under Dynamic Tag Management) which walk you through the setup of DTM and the configuration of the Analytics tool. https://outv.omniture.com/
One of the biggest benefits of DTM is being able to do exactly this in a very easy way. You can create data elements to define the different pieces of data, and then use those data elements to populate the SiteCatalyst variables in rules. It greatly simplifies an implementation and is much faster than having to add code to each page. Plus, it removes the need for developers to put props, eVars, and events on a page; they just put a data layer (preferably in JSON) with all of the data you want to send to SiteCatalyst. The structure of the data layer is unimportant, and they can use any naming format they want.
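As a rough illustration of that pattern (the object and data element names here are hypothetical, not anything DTM mandates): the developers drop a JSON data layer on the page, you create a custom-script data element in DTM that reads from it, and a rule then maps the data element onto the Analytics variables.

// Hypothetical JSON data layer placed on the page by the developers.
// DTM does not mandate this name or structure; any shape works.
(window as any).digitalData = {
  page: { name: "products:widgets", section: "products" },
};

// Body of a custom-script data element in DTM, e.g. one named "pageName":
// it simply returns the piece of the data layer you care about.
function pageNameDataElement(): string {
  return (window as any).digitalData.page.name;
}

// In a rule's custom code for the Analytics tool you could then write
// (assuming you also defined a "pageSection" data element):
// s.pageName = _satellite.getVar("pageName");
// s.eVar1 = _satellite.getVar("pageSection");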

API to search inside video content

I want to know what API is available to search inside a video archive by specifying a text query (like TalkMiner does).
There is no API for this specifically, but you can quickly cook up an application like TalkMiner by joining two components: a speech recognition engine and an indexer. For speech recognition you can use any available engine, like CMUSphinx; you first transcribe all the audio you need to search in.
For indexing you can use any indexing engine like Apache Solr. Just put the automatic transcriptions into the indexer and provide the query interface.
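A minimal sketch of the indexing half, assuming a local Solr instance with a core named transcripts (the core name is a placeholder, and transcript_txt leans on Solr's default *_txt dynamic-field convention):

// Sketch: push automatic transcriptions into Solr, then query them.
const SOLR = "http://localhost:8983/solr/transcripts";

async function indexTranscript(videoId: string, text: string): Promise<void> {
  // Solr's update handler accepts a JSON array of documents.
  await fetch(`${SOLR}/update?commit=true`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify([{ id: videoId, transcript_txt: text }]),
  });
}

async function searchTranscripts(query: string): Promise<any[]> {
  // Standard select handler; returns the documents of matching videos.
  const res = await fetch(`${SOLR}/select?q=transcript_txt:${encodeURIComponent(query)}`);
  return (await res.json()).response.docs;
}

With word-level timestamps from the recogniser stored alongside each document, you could even jump to the matching moment in the video.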
You can check out Koemei (https://koemei.com/). It's a service that you can use to index on-demand videos and search inside them for specific moments of interest. Koemei also has search embed code for videos, which you can embed on any website. And the service provides an API you can use.

How best to aggregate site statistics (especially search demand)

I am working on a Rails application that uses Sunspot Solr for search. I have been asked to log (or capture in some way) each search that happens on the site: the query, the user who did the search, the result count their search returned, etc., so that the company can report on what people are searching for (demand), among other things.
Before I go and make a table that will receive an ever-growing number of rows of search data, I'm wondering if anyone has done this in a better way. Can I use analytics (Google?) in some way for this? Is there some kind of service I can send this information to, such that we could easily pull or create reports from it?
In short, is there some better/smarter way than creating my own table and storing this all in our own DB?
I have never done this, but here are some thoughts.
If you just need to store that data, I think you should do it yourself.
If you also need to provide a way to analyse the data, then yes, see if there is anything already done (I'm not sure, but it seems Google Analytics only supports tracking internal search through its own Site Search feature).
If your client already has a BI tool, they just need a way to access the data, and it would be easier to have it in a DB you own, which you can easily query, instead of using a provider's API.
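If you do want to push this into Google Analytics rather than (or alongside) your own table, one hedged option is the Measurement Protocol: record each search as a pageview whose path carries the query parameter, so GA's Site Search reports pick it up. A rough sketch; the tracking ID, the q parameter, and the custom-metric slot for the result count are all placeholders you would configure yourself:

// Sketch: report an internal search hit to GA via the Measurement Protocol.
// "UA-XXXXX-Y" is a placeholder tracking ID; "q" must match the query
// parameter configured in the GA view's Site Search settings.
async function logSearch(clientId: string, query: string, resultCount: number): Promise<void> {
  const params = new URLSearchParams({
    v: "1", // protocol version
    tid: "UA-XXXXX-Y", // tracking ID (placeholder)
    cid: clientId, // anonymous client identifier
    t: "pageview", // hit type
    dp: "/search?q=" + encodeURIComponent(query), // page path carrying the term
    cm1: String(resultCount), // ASSUMPTION: custom metric 1 holds the result count
  });
  await fetch("https://www.google-analytics.com/collect", {
    method: "POST",
    body: params.toString(),
  });
}

The trade-off is the one noted above: GA aggregates (and may sample) the data, so if the company later wants row-level, per-user reporting, the owned table remains the safer primary store.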
