I have a website which has only one language, English.
In Google Analytics, I get two different types of URLs, which makes my results harder to analyse:
"/screen/page/obd-ii-pid-examples/language/en"
"/screen/page/obd-ii-pid-examples"
I would like to somehow "aggregate"/bundle these together so that, for example, the hit count becomes the sum of the two variants for each of my URLs.
Is this possible somehow?
Best,
Martin
The easiest thing to do would be to apply a filter to the view you are using:
https://support.google.com/analytics/answer/1033162?hl=en
A search and replace filter on /language/en would do the trick.
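To illustrate what such a filter effectively does to the page path (just a sketch, not Google's code; the exact match pattern is an assumption about how you'd target the suffix):

```typescript
// Sketch of what a "Search and Replace" view filter on the Request URI does:
// strip the trailing "/language/en" so both URL variants collapse into one page path.
function normalizePagePath(path: string): string {
  return path.replace(/\/language\/en$/, "");
}

// Both variants now report under the same path, so their hits are summed.
console.log(normalizePagePath("/screen/page/obd-ii-pid-examples/language/en"));
// -> "/screen/page/obd-ii-pid-examples"
console.log(normalizePagePath("/screen/page/obd-ii-pid-examples"));
// -> "/screen/page/obd-ii-pid-examples"
```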
Microsoft Graph provides you with "@odata.nextLink".
How can I get a "previousLink"?
This is a design choice: there is no previous link. When you are enumerating a collection, you always page through it from the beginning until the end, or until you've found the item you are looking for. If you need a different ordering for various reasons, you should leverage the $orderby query option, but those two capabilities (paging and ordering) should be considered distinct; they do not serve the same purpose.
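For illustration, a minimal sketch of paging forward with @odata.nextLink (the token handling, starting URL, and generic typing are assumptions); if you need to step "back", you keep the pages you have already fetched on the client:

```typescript
// Minimal sketch of paging a Microsoft Graph collection with @odata.nextLink.
// "accessToken" and the starting endpoint URL are assumptions for illustration.
async function fetchAllPages<T>(firstUrl: string, accessToken: string): Promise<T[][]> {
  const pages: T[][] = [];           // keep each page so you can step "back" client-side
  let url: string | undefined = firstUrl;

  while (url) {
    const res = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
    const body = await res.json();
    pages.push(body.value as T[]);   // Graph returns the items of a collection in "value"
    url = body["@odata.nextLink"];   // absent on the last page, which ends the loop
  }
  return pages;
}
```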
I just wanted to give my two cents here and say I am just as confused as you are as to why there is no previous page link.
Being able to go forward and back is standard behavior in paginated lists. This really makes no sense at all.
I'd like to understand the technology decisions behind the iOS Google app.
As we can see, the app's Google Now feature renders many different card templates for different scenarios, and those templates seem to be very flexible based on server inputs.
I was wondering whether this is all implemented with HTML5, or whether they just have many templates built in and render them locally. I'd vote for the HTML5 route, but I'm not sure whether it still involves some native code to make it more responsive.
Thanks!
As we (well, most of the community) are not Google employees, we can't tell you what they really did, but I'd say that it is possible to do this dynamically in the app.
We developed something similar that responds to definitions sent by the server and transforms them into custom-designed forms following basic rules.
Since Google reuses the design of those cards across different platforms, the easiest solution would be to show a WebView and use HTML5.
I agree with Kevin, as this answer is entirely based on personal opinion, too.
The way I would go is to create a card class which loads some JSON data and formats it with HTML and CSS. Looking at each card, it would be hell to format things that way natively. I mean, attributed strings are not the way to go. There is too much logic in deciding which card gets bigger text or a picture.
Additionally, the top header is most likely "localized" as well, so you get the location and load a localized image. But that is Google by nature.
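A rough sketch of that idea (the JSON shape and names are made up, not Google's actual format): a small renderer that turns a server-sent card definition into an HTML fragment, which a web view could then display:

```typescript
// Hypothetical card definition sent by the server; the shape is an assumption.
interface CardDefinition {
  title: string;
  body?: string;
  imageUrl?: string;
}

// Turn a card definition into an HTML fragment; CSS classes decide sizing and emphasis,
// so the native side only has to load the resulting HTML into a web view.
// (HTML escaping is omitted for brevity.)
function renderCard(card: CardDefinition): string {
  const image = card.imageUrl ? `<img class="card-image" src="${card.imageUrl}">` : "";
  const body = card.body ? `<p class="card-body">${card.body}</p>` : "";
  return `<div class="card"><h2 class="card-title">${card.title}</h2>${image}${body}</div>`;
}

console.log(renderCard({ title: "Commute", body: "12 min to work via I-280" }));
```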
I'm not sure whether this question belongs to StackOverflow, SuperUser, or yet another StackExchange site.
I have a list of locations (prospects I need to meet) I have to drive to, and I'd like to use either Google Maps or Open Street View to build an optimized route.
Are there tools that can do this, e.g. take a CSV list of locations and generate a driving route?
Thank you.
If your list is short enough, you could build a custom link with all the locations, which might be easier than the AutoIT approach. Look at the link that allows you to share a map with several route stops, and then use that syntax to chain your locations together. I think that would be easier than pasting them into the UI, which recently changed.
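For example, a small sketch (the CSV file name and the one-address-per-line format are assumptions) that chains the stops into a directions link of the form https://www.google.com/maps/dir/stop1/stop2/...:

```typescript
import { readFileSync } from "node:fs";

// Build a Google Maps directions link by chaining stops into the /maps/dir/ path.
// "locations.csv" with one address per line is an assumption for illustration.
const stops = readFileSync("locations.csv", "utf8")
  .split(/\r?\n/)
  .map((line) => line.trim())
  .filter((line) => line.length > 0);

const url = "https://www.google.com/maps/dir/" +
  stops.map((stop) => encodeURIComponent(stop)).join("/");

console.log(url);
// e.g. https://www.google.com/maps/dir/1600%20Amphitheatre%20Pkwy/1%20Infinite%20Loop
```

Note that Maps will route the stops in the order you give them; it won't reorder them into an optimized route for you.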
Is there a way to collect web content in order to use it in a search engine without passing by the web crawling phase? Any alternative to web crawling?
Thanks
No, to collect the content you have to...collect the content. :-)
Yes (and sort-of no).
:)
You can download existing data dumps from various websites (Wikipedia, Stack Overflow, etc.) and construct a partial index that way. It obviously won't be a complete index of the internet.
You could also use meta-search to construct your search engine. This is where you use the APIs of other search engines and use THEIR search results as the basis of your index. Examples include citosearch and OpenSearch. DuckDuckGo uses Yahoo's BOSS API (and now Yahoo uses Bing...) as part of its search engine.
There are also real-time streaming APIs that you could use instead of crawling the web. Look at DataSift as an example. There are lots more resources you could cleverly use to avoid or minimize crawling.
If you want to stay updated with the latest content on pages, you can use something like the PubSubHubbub protocol to get push notifications for subscribed links.
Or use paid services like Superfeedr that are built on the same protocol.
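Roughly, a subscription request looks like this (the hub, topic, and callback URLs are placeholders; the hub.* parameter names come from the PubSubHubbub/WebSub spec):

```typescript
// Minimal sketch of a PubSubHubbub (WebSub) subscription request.
// The hub, topic, and callback URLs passed in are placeholders.
async function subscribe(hubUrl: string, topicUrl: string, callbackUrl: string): Promise<void> {
  const params = new URLSearchParams({
    "hub.mode": "subscribe",
    "hub.topic": topicUrl,        // the feed you want push notifications for
    "hub.callback": callbackUrl,  // your endpoint; the hub verifies it, then POSTs updates to it
  });

  const res = await fetch(hubUrl, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: params,
  });
  if (!res.ok) throw new Error(`Subscription request failed: ${res.status}`);
}
```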
Directly or indirectly, you have to crawl the web in order to get the content.
Well, if you don't want to crawl, you can follow a wiki-like approach, where users can submit links to sites (with a title, description, and tags). In this way a collaborative link collection can be built.
To avoid spam, a +/- system can be involved, to vote useful sites or tags up and useless ones down.
To prevent spammers from mass-voting the SERPs, you can weight votes by user reputation (see the sketch below).
User reputation can be gained by submitting useful sites, or by somehow tracing usage patterns,
and by considering other abuse patterns too.
Well, you get the point, I think.
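As a toy illustration of the reputation-weighted voting idea (the field names and weighting formula are made up):

```typescript
// Toy sketch: weight each vote by the voter's reputation so a handful of
// established users outweigh a flood of freshly created spam accounts.
interface Vote {
  up: boolean;
  voterReputation: number; // earned by submitting sites that others found useful
}

function weightedScore(votes: Vote[]): number {
  return votes.reduce((score, v) => {
    const weight = Math.log10(1 + Math.max(0, v.voterReputation)); // diminishing returns
    return score + (v.up ? weight : -weight);
  }, 0);
}

console.log(weightedScore([
  { up: true, voterReputation: 5000 },
  { up: false, voterReputation: 1 },
  { up: false, voterReputation: 1 },
])); // positive: one trusted upvote beats two throwaway downvotes
```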
As spammers gradually discover the weaknesses of traditional search engines (see Google bombing, content-scraper sites, etc.), a community-based approach may work. But it would suffer severely from the cold-start effect, and while the community is small the system is easy to abuse and poison...
At least Wikipedia and Stack Exchange have not been spammed to useless levels so far...
PS: http://xkcd.com/810/
Searching with Multiple Parameters
In my app I would like to allow the user to do complex searches based on several parameters, using a simple syntax similar to Gmail's, where a user can search for "in:inbox is:unread" etc.
However, Gmail does a POST with this information, and I would like the form to be a GET so that the information is in the URL of the search results page.
Therefore I need the parameters to be formatted in the URL.
Requirements:
Keep the URL as clean as possible
Avoid the use of invalid URL chars such as square brackets
Allow lots of search functionality
Have the ability to add more functions later.
I know StackOverflow allows the user to search by multiple tags in this way:
https://stackoverflow.com/questions/tagged/c+sql
However, I'd like to also allow users to search with multiple additional parameters.
Initial Design
My current design is to use URLs such as these:
http://example.com/search/tagged/c+sql/searchterm/transactions
http://example.com/search/searchterm/transactions
http://example.com/search/tagged/c+sql
http://example.com/search/tagged/c+sql/not-tagged/java
http://example.com/search/tagged/c+sql/created/yesterday
http://example.com/search/created_by/user1234
I intend to parse the URL after the search parameter, then decide how to construct my search query.
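To make that concrete, here is roughly how I imagine parsing those path segments (just a sketch; the function name and return structure are my own illustration):

```typescript
// Sketch: parse "/search/tagged/c+sql/not-tagged/java" into key/value pairs.
function parseSearchPath(path: string): Record<string, string[]> {
  const segments = path.split("/").filter(Boolean);
  const start = segments.indexOf("search") + 1;   // everything after the /search/ segment
  const params: Record<string, string[]> = {};

  for (let i = start; i + 1 < segments.length; i += 2) {
    const key = segments[i];
    const value = decodeURIComponent(segments[i + 1]);
    params[key] = value.split("+");               // "c+sql" -> ["c", "sql"]
  }
  return params;
}

console.log(parseSearchPath("/search/tagged/c+sql/not-tagged/java"));
// -> { tagged: ["c", "sql"], "not-tagged": ["java"] }
```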
Has anyone seen URL parameters like this implemented well on a website?
If so, which do it best?
What you have here isn't a bad start.
Some things to keep in mind: there is a length restriction on URLs, roughly 2000 characters in IE. Keep this in mind in the battle between SEO and readability versus brevity.
I'm not aware of any standards in this arena beyond common sense, which it appears you've captured.
Another thing to keep in mind is that most search engines use standard URL parameters, e.g. http://www.google.com/search?hl=en&source=hp&q=donkeys+for+sale&aq=f&aqi=g10&aql=&oq=&gs_rfai=
There is a good reason for this, namely URL encoding and allowing for non-traditional characters in the search box.
So while pretty URLs are nice, they fail here for a variety of reasons.
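For comparison, a quick sketch of the standard query-string style, where the encoding of awkward characters is handled for you (the parameter names are just examples):

```typescript
// Sketch: the conventional query-string style, which URL-encodes awkward
// characters (spaces, brackets, ampersands, non-ASCII) automatically.
const params = new URLSearchParams({
  tagged: "c sql",
  "not-tagged": "java",
  q: "transactions & [rollback]",
});

console.log(`http://example.com/search?${params.toString()}`);
// -> http://example.com/search?tagged=c+sql&not-tagged=java&q=transactions+%26+%5Brollback%5D
```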