My website was hacked with the Japanese SEO spam virus. I have cleaned up the infection and started to resubmit the website to Google.
What is the best option here to clear all the cached links and snippets and trigger reindexing, since Google still shows all the Japanese links in a site: search?
Google offers two options:
Temporarily remove URL
Clear cached URL
How can I flush all of these URLs from Google's index and force resubmission?
Thanks in advance!
Log in to Google Search Console.
Select the property you want.
Then find the "Remove URLs" section under Google Index.
Create a new removal request, enter the desired link in the window that opens, and click "Submit".
If there are a lot of them, you can also remove a whole path prefix at once, e.g. /your-url/*.
https://support.google.com/webmasters/answer/9689846?hl=en
You can use 301 redirects from old URLs to new ones if necessary.
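If the spam URLs share a recognisable pattern, a minimal .htaccess sketch like the following could handle them; the path patterns and targets are assumptions you will need to adapt to your own site. A 410 tells Google the injected pages are gone for good, while a 301 points an old URL at a real replacement page:
RewriteEngine On
# Hypothetical pattern for the injected Japanese spam pages - adjust to your own URLs
RewriteRule ^cheap-.* - [G,L]
# Where a legitimate replacement page exists, a permanent redirect is the better choice:
Redirect 301 /old-page/ /new-page/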
It would also be worth getting advice from an SEO specialist, since every hacked-site situation is a bit different; the support article linked above can help you.
To speed up the indexing of new URLs or updated content, I would recommend giving the site a clean internal link structure so that the search robot can find your URLs. Also, create a sitemap.xml and submit it in Google Search Console.
Related
I have a website that has been replaced by another website with a different domain name.
In Google search, I can still find links to pages on the old site, and I hope they will not show up in future Google searches.
Here is what I did, but I am not sure whether it is correct or enough.
Any access to a page on the old website is immediately redirected to the homepage of the new website; there is no one-to-one page mapping between the two sites. Here is the code for the redirect on the old website:
<meta http-equiv="refresh" content="0;url=http://example.com" >
I went to Google Webmasters site. For the old website, I went to Fetch as Google, clicked "Fetch and Render" and "Reindex".
Really appreciate any input.
A few things you'll want to do here:
You need to use permanent server-side (301) redirects, not a meta refresh, and I suggest you provide a one-to-one page mapping (see the sketch after this list): it's a better user experience, and large numbers of redirects to the root are often interpreted as soft 404s. Consult Google's guide to site migrations for more details.
Rather than Fetch & Render, use Google Search Console's (Webmaster Tools) Change of Address tool. Bing has a similar tool.
A common mistake is blocking crawler access to a retired site. That has the opposite of the intended effect: the old URLs need to stay accessible to search engines for the redirects to be "seen".
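For example, a minimal .htaccess sketch on the old domain that replaces the meta refresh with a permanent server redirect and keeps a one-to-one path mapping (http://example.com stands in for the new domain, as in the snippet above):
RewriteEngine On
# Permanently redirect every request to the same path on the new domain (one-to-one mapping)
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]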
I changed the CMS I use for my website, so all the links are new. I don't want any of the old URLs in the Google index; the problem is that these old links are still pointed to from other sites.
Send a message to the external site owners and ask them to change the links;
Open Google Search Console and go to "Google Index / Remove URLs"; there you can temporarily block the old URLs (https://support.google.com/webmasters/topic/4598466?visit_id=1-636513744496718362-859155776&rd=1);
In any case, removed old URLs will automatically disappear from the Google index, but it takes a long time (months or even a year).
A month ago I relaunched a website with the TYPO3 CMS. Before that, the site ran on the Joomla CMS.
In the Joomla configuration, SEO-friendly URLs were disabled, so Google indexed the page URLs like this:
www.domain.de/index.php?com_component&itemid=123....
for example.
Now, a month later (after the TYPO3 relaunch), these links are still visible in Google because the URLs don't return a 404 error. That's because "index.php" also exists in TYPO3, and TYPO3 doesn't care about the additional query string/variables: it returns a 200 status code and shows the front page.
In Google Webmaster Tools it's possible to delete single URLs from the Google index, but that way I would have to delete about 10,000 URLs manually...
My question is: is there a way to remove these old URLs from the Google index?
Greetings
With this number of URLs there is only one sensible solution: implement proper 404 handling in your TYPO3 site, or better yet, redirects to the same content now placed in TYPO3.
You can use TYPO3's handler (search for it in Install Tool > All configuration); it's called pageNotFound_handling. You can use options like REDIRECT for redirecting to some page, or even USER_FUNCTION, which allows you to run your own PHP script; check the description in the Install Tool.
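As a rough sketch, on an older TYPO3 install (before this handler was removed in v9) the REDIRECT variant could be configured in typo3conf/LocalConfiguration.php roughly like this; the /404/ target page is an assumption:
<?php
// typo3conf/LocalConfiguration.php (fragment) - hypothetical sketch for an older TYPO3 install
return [
    'FE' => [
        // send every "page not found" hit to a dedicated 404 page
        'pageNotFound_handling' => 'REDIRECT:/404/',
        // and send the proper status header along with it
        'pageNotFound_handling_statheader' => 'HTTP/1.0 404 Not Found',
    ],
];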
You can also write a simple condition in TypoScript and check whether typical Joomla parameters exist in the URL; that way you can easily return a custom 404 page. If you need a more sophisticated condition (for example, you want to redirect links that previously pointed to some gallery in Joomla to the new gallery in TYPO3), you can use a userFunc condition, and that would probably be the best option for SEO.
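A rough TypoScript sketch of that idea, assuming the legacy (pre-v9) condition syntax and that the old Joomla URLs can be recognised by their itemid parameter:
# Hypothetical sketch: if a Joomla-style "itemid" query parameter is present,
# answer with a 404 status instead of serving the front page with a 200.
[globalVar = GP:itemid > 0]
    config.additionalHeaders = HTTP/1.0 404 Not Found
[global]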
If these URLs share a manageable number of common indicators, you can handle them with a rule in your virtual host or .htaccess so that Google runs into the correct error response.
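A minimal .htaccess sketch of that approach, assuming the old Joomla URLs can be recognised by their query string (the parameter names are taken from the example above and may need adjusting):
RewriteEngine On
# Old Joomla URLs carry parameters such as com_component or itemid in the query string
RewriteCond %{QUERY_STRING} (^|&)com_component($|&|=) [NC,OR]
RewriteCond %{QUERY_STRING} (^|&)itemid= [NC]
# Answer with "410 Gone" so search engines drop these URLs quickly;
# use a full RewriteRule target with [R=301,L] instead if you prefer to redirect to new content
RewriteRule ^index\.php$ - [G,L]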
I wrote a Google Chrome extension to remove URLs in bulk in Google Webmaster Tools. Check it out here: https://github.com/noitcudni/google-webmaster-tools-bulk-url-removal.
Basically, it's a glorified for loop. You put all the URLs in a text file. For example:
http://your-domain/link-1
http://your-domain/link-2
Having installed the extension as described in the README, you'll find a new "choose a file" button.
Select the file you just created. The extension reads it in, loops through all the URLs and submits them for removal.
It would be great if you could shed some light on this; it has baffled me:
I was asked by a client if I could try to make the search term for his comedy night, "sketchercise", put his website at the top of the Google ranking. I simply changed the title tag in the header for the whole site from "Allnutt and Simpson" to "Allnutt and Simpson - Sketchercise # Ginglik - Sketch Duo". It did the trick, and the site now comes up at the top of the Google listing when typing in "sketchercise". However, the result shows this very strange link:
http://www.allnuttandsimpson.com/index.php/videos/
This is the link to the google search result too:
http://www.google.co.uk/search?sourceid=chrome&ie=UTF-8&q=sketchercise
This link is invalid; it doesn't make any sense. I guess it has something to do with the use of hash tags and the AJAX-driven site, but before I changed the title tag, it linked to the site fine using the # tags. What is the deal with this slash?
The strangest part is that the valid URL for the videos page on that site is /index.php#vidspics; I have never used the word "videos" in a URL!
If anyone can explain the cause of this or just help me stop it from happening, I'd be very grateful. I realise that this is an SEO question and I hate that stuff generally, but I hope you can see this is a bit of a strange case!
Just to compare, if you google "allnutt and simpson" it works just fine: it links to the site and all of its pages absolutely fine as .php pages (and then my JS converts them to hash tags to keep things clean).
It's probably because there is a folder called 'videos' among your hosted files; use an FTP client and check this.
Google crawls every folder and file unless you tell it not to; look up robots.txt to learn how to keep content from being crawled.
Also, ask Google to remove that result once you have solved this.
Finally, that behaviour is not related to hash tags; those are just references used by your JavaScript to display the appropriate content on your webpage.
Not sure why it's posted like this, but the only way to stop that page from appearing is to use a Google webmaster account for this website and make sure the crawlers can't find this link any more. The alternative is to have the site admin output the tag <meta name="robots" content="noindex, nofollow"> in the header when isset($_REQUEST['videos']) is true.
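A minimal PHP sketch of that idea (the parameter name "videos" is an assumption based on the URL above):
<?php
// Hypothetical sketch: emit a noindex/nofollow meta tag in the <head> only when the
// stray "videos" parameter is present, so the phantom URL drops out of the index.
if (isset($_REQUEST['videos'])) {
    echo '<meta name="robots" content="noindex, nofollow">';
}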
The slash in the address is the parsed form of www.allnuttandsimpson.com/index.php?=videos. You can have the web server change all the php parameters into slashes to make the links look pretty.
The best option for correct results is to create a sitemap and submit it at https://www.google.com/webmasters/tools/ for that site. You will need access.
Oh, I forgot: the sitemap will make Google see all the pages you want it to show; use it for the major pages, like those in the main menu. To keep crawlers away from links you don't want, you need a robots.txt in the root directory of the site.
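A minimal sketch of such a robots.txt, assuming the stray folder really is called /videos/ and that the sitemap sits in the web root:
# robots.txt in the site's root directory (hypothetical paths)
User-agent: *
# keep crawlers out of the folder that produced the unwanted result
Disallow: /videos/

# point crawlers at the sitemap listing the pages you do want indexed
Sitemap: http://www.allnuttandsimpson.com/sitemap.xml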
Let's say my site has the following URLs indexed in Google:
/test/1
/test/2
/test/3
For some reasons, I want those same pages to have the following URLs:
/test/abc
/test/def
/test/ghi
I noticed that even if I use a 301 redirect from /test/1 to /test/abc, the URL /test/1 stays in the Google index for a while after the robot hits the redirect and discovers the change.
Is it normal that it takes a few weeks for the old URLs to disappear from the search engine index, or is there a better way to let it know about the changes?
Should I use the URL removal tool?
Will a new sitemap in Google Webmaster Tools help to get rid of the old URLs?
Help me see inside the Google black box :)
Answering your questions:
Yes, it's normal for this process to take a few weeks; this is nothing to worry about.
The URL removal tool is only for URLs that no longer exist; you can't use it for URLs that now return a 301 (see: http://www.google.com/support/webmasters/bin/answer.py?answer=59819&hl=en).
An XML sitemap is mainly for telling Google about new pages and pages that have changed recently, so I don't think it will help you here.
In short, the index will update naturally, you just need to let Google do its thing.