If by "cache" you mean the copies of pages that Google keeps in its index:
Google does not update or reindex pages immediately after you change your site. At most it collects the content and schedules it for processing, which can take weeks or months depending on traffic and the site's rank, and there is no way to control or force Google to reindex. There is also no meaningful average time to quote: Google is always changing, and what holds today may not hold tomorrow.
These "cache" pages you refer to that appear next to the search result take some time to update, as well as with indexing.
Redirecting old links
If the links that appear in the cache are old and the URL structure has changed, for example from:
http://examplo/siteantigo/?pag=produto&id=1
And now it’s accessed like this:
http://examplo/sitenovo/produto/1
A good strategy is to reuse the old links that are already indexed. Of course this is not guaranteed; it depends on the rank of each link within the site. To reuse them, and to signal that they represent the same content as the new pages, you need a permanent (301) HTTP redirect. For example, create a script like this in siteantigo/index.php:
<?php
if (isset($_GET['pag'], $_GET['id'])) {
    /* This redirects to: http://exemplo/sitenovo/{pagina}/{id} */
    /* urlencode prevents header injection through the query string */
    header('Location: /sitenovo/' . urlencode($_GET['pag']) . '/' . urlencode($_GET['id']), true, 301);
    exit; // stop the script so nothing else is sent after the redirect
}
Of course this is just an example to explain the redirect; there is no ready-made formula. It depends on how the old site and the new one were built. If they use well-known platforms such as Joomla or WordPress, plugins for this probably exist, but that is a broad topic.
Note that if you do not redirect the old URLs, the Google Webmaster Tools dashboard will likely start showing a number of "Not found" or "Soft 404" errors. A Soft 404 occurs when a page returns a success code such as 200 but the response is empty or has no real content; Google can detect this even when the page has menus and a footer but no meaningful body, or only an error message.
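To avoid a Soft 404, an old URL that has no equivalent on the new site should answer with a real 404 status. A minimal sketch of that idea, where the map of old page names to new slugs is a hypothetical example:

```php
<?php
// Decide where an old-style URL should go on the new site.
// Returns [statusCode, locationOrNull]. $map (old page name =>
// new slug) is a hypothetical example, not a fixed convention.
function route_old_url(array $get, array $map): array
{
    $pag = $get['pag'] ?? '';
    if (isset($map[$pag], $get['id'])) {
        // Known page: permanent redirect to /sitenovo/{slug}/{id}
        return [301, '/sitenovo/' . $map[$pag] . '/' . (int) $get['id']];
    }
    // No equivalent page: answer with a real 404, not a Soft 404
    return [404, null];
}

$map = ['produto' => 'produto'];
[$code, $loc] = route_old_url($_GET, $map);
if ($code === 301) {
    header('Location: ' . $loc, true, 301);
    exit;
}
http_response_code(404);
```

This way Google sees an honest 404 for pages that really are gone, and a 301 for pages that moved, instead of a 200 with empty content.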
META tag revisit-after
There is even a <meta> tag intended for this, revisit-after. For example, to ask for a revisit every 15 days:
<meta name="revisit-after" content="15 days">
You could even set it to one day, but Google generally ignores it: many sites used it to try to look up to date while their actual update rate was very low, and it would be bad for Google if every site in the world demanded daily recrawls. Instead, Google analyzes the traffic coming from searches and calculates which pages need more frequent updates, rather than using the value you set in the tag.
According to https://developer.mozilla.org/pt-PT/docs/Utilizando_meta_tags, revisit-after is actually meant for proxy servers.
Sitemap.xml
You can use these tags:
<priority> determines the importance of the page.
<lastmod> sets the date of the last update.
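Combined in one sitemap entry, these tags might look like this (the URL and values here are only illustrative):

```xml
<url>
    <loc>http://www.examplo.com/produto/1</loc>
    <lastmod>2015-06-21</lastmod>
    <priority>0.8</priority>
</url>
```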
But note that this is no guarantee of speeding up indexing or cache renewal. If you need to force the removal of a URL, you can use the <expires> tag to drop URLs you no longer want, for example:
<url>
    <loc>http://www.examplo.com/expired.html</loc>
    <expires>2011-06-21</expires>
</url>
One of the best approaches is to keep the sitemap updated and use Google Webmaster Tools to keep everything current. Resubmitting the site can help, as long as Google hasn't changed the rules of the game again: https://www.google.com/webmasters/tools/submit-url
– Bacco
Thanks, I'll run the Google checks.
– Anderson Brunel Modolon