Traversing Google searches using LuaSocket - lua

I am trying to make a Lua program that performs a Google search and formats all of the results in a different arrangement. The only problem is that I cannot find a way to do the following:
get the HTML of the search results
search through all of the result pages
search with omitted results included

Have you looked into LuaCurl?
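Whatever the language, each of those steps comes down to one HTTP GET per results page. Below is a minimal sketch, for illustration only, written in Python with the requests library (in Lua, socket.http.request from LuaSocket or LuaCurl would play the same role). The start parameter steps through result pages, and filter=0 is commonly used to ask for normally omitted results to be included; note that Google may throttle or captcha automated requests.

import requests

# Minimal sketch: fetch one page of Google results as raw HTML.
def fetch_results_page(query, start=0):
    params = {
        "q": query,      # the search terms
        "start": start,  # 0, 10, 20, ... walks through result pages
        "filter": "0",   # ask for results Google would normally omit
    }
    headers = {"User-Agent": "Mozilla/5.0"}  # a bare default client UA is rejected quickly
    resp = requests.get("https://www.google.com/search", params=params, headers=headers)
    resp.raise_for_status()
    return resp.text     # raw HTML of the results page

html = fetch_results_page("luasocket tutorial")
print(len(html), "bytes of HTML")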

Related

How to exclude a site or domain from a Google search

I'm trying to search for something on the web, say "cold mountain by charles frazier" (without the quotes) and all the results I'm getting are on books.google.com, and they aren't helpful. So, I want to exclude results from Google Books when I search. Is there some Google-search syntax to search outside a particular domain?
P.S. I know there's the - operator to exclude results containing a particular keyword.
You can restrict a search to a specific domain with site: and exclude a specific domain with -site:.
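For example, the search from the question with Google Books excluded:
cold mountain by charles frazier -site:books.google.com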

Performing Google Search In Spreadsheet And Scraping The SERP Data

I hope you guys are doing fine. I want to build a simple spreadsheet; I thought I would be able to make one myself, but a blank sheet looks daunting to me. I am sure you guys are kind enough to help me out.
I want to perform multiple Google search queries in a Google spreadsheet and parse the results of each search (the top 10 results per query).
Something like this: https://www.youtube.com/watch?v=tBwEbuMRFlI
But when I tried the formula given in the video description, Google Sheets returned #Error, and I don't know why.
Can you guys please help me build a simple spreadsheet that handles multiple queries at once? One column for keywords (where I could paste my list of keywords) and then 10 columns of search results, with all the results for one keyword in one row.
Something like this:
My 1st Example Query = 1st search result, 2nd search result, 3rd result and so on.
My 2nd Example Query = 1st search result, 2nd search result, 3rd result and so on.
It should be easy to code, but it might be time-consuming, and I would be very grateful if any of you could help me with it.
Looking forward to your help guys.
The problem is that you want to scrape from inside Google Sheets; that's a bad approach and is almost certainly not going to work. Even if you manage to write a scraper inside that limited environment, it will easily be spotted by Google.
Since you said time is not a problem, I would suggest another route:
Use a backend tool/script that scrapes the data
Use a backend tool/script that creates/modifies the Google spreadsheet
You can run such scripts manually on your PC, or fully automated from a server using a scheduler/cron job.
To create/modify spreadsheets look here: How do I access the Google Spreadsheets API in PHP?
To scrape Google look here: Is it ok to scrape data from Google results?
Those links use PHP as the language of choice, but you can do exactly the same thing in Java, Python, or C#; a minimal Python sketch of the spreadsheet step follows.
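For the spreadsheet half, here is a minimal sketch in Python (the linked answer uses PHP, but the idea is identical). It assumes the gspread library and a Google service-account key that has been shared with the target spreadsheet; the key file name, spreadsheet name, and example data below are placeholders.

import gspread

# Minimal sketch: append one row per keyword to a Google spreadsheet.
gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("SERP results").sheet1

def write_row(keyword, results):
    # one row: the keyword followed by its top 10 results
    sheet.append_row([keyword] + results[:10])

write_row("my 1st example query", ["1st search result", "2nd search result", "3rd result"])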
There is a third-party solution, SerpApi, that you could use for this. It's a paid API with a free trial.
Google Sheets Add-on: SerpApi - Search Engine Results and Ranks
Example formula to extract the title of the first result:
=SERPAPI_RESULT("engine=google&q=coffee&location=Austin, Texas, United States&google_domain=google.com&gl=us&hl=en", "organic_results.0.title")
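The same lookup can also be done from a backend script rather than a formula. Here is a minimal sketch in Python using the requests library against SerpApi's JSON endpoint with the same parameters as the formula above; the API key is a placeholder, and the path organic_results.0.title maps to data["organic_results"][0]["title"].

import requests

params = {
    "engine": "google",
    "q": "coffee",
    "location": "Austin, Texas, United States",
    "google_domain": "google.com",
    "gl": "us",
    "hl": "en",
    "api_key": "YOUR_SERPAPI_KEY",  # placeholder
}
data = requests.get("https://serpapi.com/search.json", params=params).json()
print(data["organic_results"][0]["title"])  # title of the first organic result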

Get all URLs indexed by Google for a website

I want a program that does this: given a website, get all the URLs Google has indexed for it, with clean output (all the URLs line by line), and in particular the indexed URLs that are not linked anywhere on the website (a spider can already find the linked ones).
I have been searching and only finding sloppy options; what I want is accurate and simple. Input: a URL. Output: all the URLs.
I don't know of such an application, but I'll try to simplify your task by dividing it:
You need a list of your website's internal links. Any web crawler tool can do that.
You need a list of your website's pages indexed by Google. There are a lot of search-engine index checkers; you can google them.
Compare the 2nd list to the 1st one and find all the links present in Google's index but missing from your website (a minimal sketch of this comparison follows).
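That comparison is just a set difference. A minimal sketch in Python, assuming both lists have already been saved to text files with one URL per line (the file names are placeholders):

# URLs Google has indexed but that no internal link points to.
with open("google_indexed.txt") as f:
    indexed = {line.strip() for line in f if line.strip()}
with open("crawled_internal_links.txt") as f:
    crawled = {line.strip() for line in f if line.strip()}

for url in sorted(indexed - crawled):
    print(url)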

Ruby on Rails google-search gem

My problem is that I'm trying to check the content of a text area against the web, looking for plagiarized content. My high-level solution is to get the content of the text area, enclose it in "double quotes", and do a Google search, and I want to return the top 5 sites that Google returns.
To implement that pseudocode in my app, I installed the google-search gem, but when I run my query, the sites returned by the gem have missing items. Take for example the search "EVE Search - Heavy dict defence?": if you run it on Google, it returns one site, but in my app it returns nothing.
Anybody have any ideas?
I believe accessing search through Google's API is different from accessing it from your browser. Google returns its search results dynamically based on things like browser history, your local cache and cookies, etc. Even if you searched for "EVE Search - Heavy dict defence?" from two different browsers, your results could vary.
Also, check out this answer: https://stackoverflow.com/a/654558/1481644
You can go with the Ruby web-search gem. It takes a query string to search and returns the matching sites; for more info, check https://github.com/mattetti/ruby-web-search

How can I search documents from Google using C# code?

I am currently working on document search via Google. I don't want to do any HTML parsing; instead, I am looking for an API that searches Google for documents on the internet.
Basically, my requirement is to search for .pdf and .doc files through Google search. I have done some googling and found that this causes Google to introduce a captcha, and that there is a limit of 100 queries per day.
Is there any free API from Google, or failing that a paid one, to which I can pass a search query and get the results?
Please note, I don't want to do HTML parsing at all.
Moreover, is there any way to overcome the captcha issue?
