For my Grails application I want to set up Google Analytics to track only "partial" URLs. I'll explain:
A typical Grails URL consists of the following parts: domain + application name + controller + action + id,
e.g. www.mydomain.com/myapp/controller/action/12345.
As far as I understand, Google Analytics identifies the page to be tracked by the entire URL. For my purpose I'm not interested in the id part of the URL: I want to know which actions have been performed, but I don't need to know for which id an action was executed.
And of course I would like a generic solution, because I have multiple controllers and multiple actions. Maybe some kind of filter stating "track pages three levels deep (/myapp/controller/action)" would do? Or a filter stating "exclude everything in the URL after the last /"?
Any help would be much appreciated.
Kind regards,
Pieter
I think this issue is best solved within Google Analytics itself, where you can create a specific report that ignores the id part of the URL.
That way you can use Google Analytics at its simplest and need not make any code changes to your project.
There can be several approaches. The first thing that comes to mind is taking one of these steps:
Using profile filters
Generating the same virtual pageview for every action id
Using the advanced segments tool with a proper condition (page URL pattern match)
Each approach has its pros and cons; choosing the proper one depends on the goal you are trying to achieve.
I think this question is best answered by this article.
As the other contributors suggested, I too thought the issue should be resolved in Google Analytics. I clicked around a bit and got hopelessly confused.
Solving the issue within Grails is much, much easier. In short, the answer is:
In the Google Analytics tracking JavaScript there is a "_trackPageview" action. This action can take as a parameter the URL you want to track, so in Grails I can simply pass the part I want tracked: application/controller/action.
My Google Analytics script is in my main template, and I just use:
_gaq.push(['_trackPageview','myapp/${controllerName}/${actionName}']);
("myapp" should be the name of your application; ${controllerName} and ${actionName} are generically available variables in the Grails views.)
Hope this will help others.
Thanks for the other answers.
Which URL structure should I use for my web app?
Clean URLs, like this:
http://dashboard.company.com/sales/john-doe/2017/32
Or with URL parameters, like this?
http://dashboard.company.com/sales?person=john.doe&year=2017&week=32
Are there any guidelines for this?
Edit, to explain my question better: from the user's perspective the two ways are identical for sharing the URL, but for the programming part they are not (I use Flask). I want to know if there is a standard way of handling this, and which way is better.
Background
I am developing a sales dashboard for internal use at my company. It displays the sales of every salesperson. I want to make the reports shareable, so that my colleagues can send their own page for a certain week number to each other, or the boss can easily pull up the right page for a meeting with a salesperson.
No SEO
Just to stress this point: I don't need clean URLs for SEO.
From a functional standpoint it doesn't matter much: whether you pass the parameters in the path or as a query string, they will be visible either way. But if you use a framework for your app, keep the URLs as clean as possible, because route parameters should identify a specific resource for the controller rather than carry arbitrary data. If it's not a big project you can get away with query parameters, but make sure you won't soon end up with something like ?lang=en tacked on as a main parameter. It's up to you; read up on the differences between GET and POST and you'll figure it out.
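For what it's worth, both styles map naturally onto Flask routes. A minimal sketch (render_report is a hypothetical stand-in for whatever builds the actual report page):

from flask import Flask, request

app = Flask(__name__)

def render_report(person, year, week):
    # Hypothetical placeholder for the real report rendering
    return "Sales for %s, week %s of %s" % (person, week, year)

# Clean-URL style: the data is part of the path
@app.route("/sales/<person>/<int:year>/<int:week>")
def sales_by_path(person, year, week):
    return render_report(person, year, week)

# Query-parameter style: the same data arrives as ?person=...&year=...&week=...
@app.route("/sales")
def sales_by_query():
    person = request.args.get("person")
    year = request.args.get("year", type=int)
    week = request.args.get("week", type=int)
    return render_report(person, year, week)

Either way the URL stays shareable; the path style reads better and lets Flask validate year and week as integers for free.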
I've been doing some programming off and on for my brother, who is a stock trader. I'm wondering if it is possible to receive a push notification when a site's server adds a page. For example, the site smallcapfortunes.com frequently adds pages that are simple extensions of the main URL, such as /neca/, /stev/, etc.
Are there existing methods to execute this? Or is this something I need to write myself? Has anyone here written anything like that?
I know there are existing sites to track basic updates to a single page. In my research, though, I haven't found anything like this.
Please let me know if there are any other details I need to provide.
Generally you can only get a push notification if a specific website offers that service.
Some websites publish a structured (XML) site map. If the one you're interested in does that, you could pull that sitemap on a regular basis and look for differences.
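A rough Python sketch of that approach (the sitemap location and polling interval are assumptions, and the print call stands in for whatever notification mechanism you prefer):

import time
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://smallcapfortunes.com/sitemap.xml"  # assumed location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_urls():
    # Download the sitemap and return the set of page URLs it lists
    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)
    return {loc.text.strip() for loc in tree.iterfind(".//sm:loc", NS)}

known = fetch_urls()
while True:
    time.sleep(15 * 60)  # poll every 15 minutes
    current = fetch_urls()
    for url in sorted(current - known):
        print("New page:", url)  # replace with an email/SMS/push notification
    known = current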
You're most likely going to want to use http://scrapy.org/ to go through the site and find new /neca/, /stev/, etc. URLs, then just trigger the script every so often.
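A minimal scrapy spider along those lines (the domain comes from the question; scheduling the runs and comparing their outputs is left to you):

import scrapy

class NewPageSpider(scrapy.Spider):
    name = "newpages"
    allowed_domains = ["smallcapfortunes.com"]
    start_urls = ["http://smallcapfortunes.com/"]

    def parse(self, response):
        # Yield every on-site link found on the front page, e.g. /neca/, /stev/
        for href in response.css("a::attr(href)").getall():
            url = response.urljoin(href)
            if "smallcapfortunes.com" in url:
                yield {"url": url}

Run it on a schedule (cron, for example) with scrapy runspider newpages.py -o urls.json, and diff each run's output against the previous one to spot new pages.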
Hi, I am trying to build a simple application using Grails in which I need to crawl three websites to get data about the price of a book. After getting those details, when the user selects a book to buy, the app has to redirect to the selected site; for an example, see http://www.mydiscountbay.com/. I am stuck: I don't know how to implement a simple crawler in Grails. Please point me to sample code or a tutorial on how to implement it.
Thanks in advance
Implementing a crawler has nothing to do with Grails specifically; there are some open-source Java crawlers that you may be able to use or customize to your needs. The front-end part would be like a normal Grails web app.
Using something like URL#getText() will not get you very far with sites that have redirects, cookies, etc.
For anything even a little bit involved, use Apache Commons HttpClient, or the Groovy HTTPBuilder:
http://hc.apache.org/httpcomponents-client-ga/index.html
http://groovy.codehaus.org/HTTP+Builder
To parse the response and extract content, use XmlSlurper, e.g.: Using XmlSlurper: How to select sub-elements while iterating over a GPathResult
For an event in a couple of weeks I'd like to make a web page/app which displays tweets from a specific user, a specific hashtag, and all @replies to the first user in 3 boxes on the screen.
However, I've never tried this. I want to use either .NET (C#) or HTML/CSS/JS, since I'm proficient in those. Are there any libraries/APIs I can use? Or is there a readily available freeware/open-source app I can use?
Have you seen TweetSharp?
Use Twitter's profile and search widgets: the profile widget for the first box, a search on the hashtag for the second box, and a search on to:username for the third box.
I actually just posted this as an answer to another question:
I just updated a plugin to work with the Twitter 1.1 API. Unfortunately, per Twitter's urging, you will have to perform the actual request from server-side code. However, you can pass the response to the plugin and it will take care of the rest. I don't know what framework you are running, but I have already added sample code for making the request in C#, and will be adding sample code for PHP, shortly.
The plugin makes a call to statuses/user_timeline, but you will likely want to look at statuses/filter or search/tweets instead. All you have to do is add your desired parameters (hashtag, replies, etc.) to the server-side code and it should work (with the addition of your security keys and tokens, of course).
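For anyone not on C#, here is a rough Python equivalent of that server-side request (a sketch only: the credentials are placeholders, and the requests and requests_oauthlib libraries are my assumption, not part of the plugin):

import requests
from requests_oauthlib import OAuth1

# Placeholder credentials from your app's settings on dev.twitter.com
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
              "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

# statuses/user_timeline, as the plugin uses; swap in search/tweets for hashtags
resp = requests.get(
    "https://api.twitter.com/1.1/statuses/user_timeline.json",
    params={"screen_name": "twitterapi", "count": 20},
    auth=auth,
)
resp.raise_for_status()
tweets = resp.json()  # hand this JSON to the client-side plugin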
Good luck! :)
Basically I want to know how many people have tweeted a link to a URL, but since there are dozens of link shorteners out there, I don't see any way to do this without having access to all of their URL maps. I found a previous question here, but it was over a year old and didn't have any new answers.
So #1, does anyone know of a service/API for doing this?
And #2, can anyone think of a way to accomplish this task other than submitting the long url in question to all the popular link shortening sites?
P.S. I'm also open to comments about why this is impossible or impractical.
You could perform a Google search (or the equivalent via API) for any pages that link to your page. This is done with the link: keyword. So if you're trying to figure out how many people link to www.example.com (regardless of whether it's through a link-shortener URL), then you would just do a Google search for link:www.example.com.
e.g.: http://www.google.com/search?q=link:www.example.com
Note that this will only find pages that have been indexed, so pages that haven't been crawled, or pages that get crawled infrequently, will not show up in the results until a later date (if at all).
Since all of these sites have different algorithms for shortening URLs, and they most likely do not share their data with each other, how can you hope to find all of the short links in a single query, or even a small number of queries?
All you can do is brute-force it, and even that might not do any good if a site is happy to mint a new short value for the same long-form URL each time (especially if you submit a different long-form URL that maps to the same place, like http://www.stackoverflow.com/ rather than http://stackoverflow.com/).
For this to really work, there would have to be a site that already collects all of this information, and that every URL-shortening site voluntarily reports to. Even if you wrote such a site, that doesn't account for the data the existing URL-shortening sites already hold!
In short, I do not see how this is remotely possible, unless I'm wrong about such a database existing somewhere out there.
So, months after asking this question, I came across the solution to a similar question: how to tell how many times a link has been shared on Facebook. The solution is a simple API call:
http://graph.facebook.com/http://stackoverflow.com
returns the following json data:
{
  "id": "http://stackoverflow.com",
  "shares": 1627
}
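If you want to call this from code, a minimal Python sketch with the requests library (the URL is the same example as above; a page with no shares simply omits the "shares" field, so it is treated as zero):

import requests

def share_count(url):
    # The Graph API accepts a URL in place of an object id and returns share data
    resp = requests.get("http://graph.facebook.com/" + url)
    resp.raise_for_status()
    return resp.json().get("shares", 0)

print(share_count("http://stackoverflow.com"))  # e.g. 1627 at the time of writing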