How will the Yandex and Google crawlers react if I regenerate the sitemap every time, regardless of changes?
I'm going to use spatie sitemaps to generate the sitemap for my Laravel site. The simplest use case is a full site crawl that outputs a sitemap file. So, to keep things simple, I plan to run this crawl twice a day, which means the sitemap will always be fresh. The sitemap includes a modification date for each page, and with this approach all of those dates get updated on every run, regardless of whether the pages actually changed. Question: how do crawlers treat this? Won't it be seen as an obsessive attempt to get pages re-indexed without any changes, or is it better not to do this?
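For context, here is a minimal sketch of the scheduled full-crawl approach described above. It assumes a Laravel 11+ project with spatie/laravel-sitemap installed; the domain and the 01:00/13:00 run times are placeholders, not part of the original question:

```php
<?php

// routes/console.php (Laravel 11+ scheduler) — sketch of the twice-daily full crawl.
// Assumes the spatie/laravel-sitemap package; https://example.com is a placeholder.

use Illuminate\Support\Facades\Schedule;
use Spatie\Sitemap\SitemapGenerator;

Schedule::call(function () {
    // Crawls the whole site and writes the result to public/sitemap.xml.
    // With this approach every <lastmod> reflects the time of this run,
    // whether or not the underlying page actually changed.
    SitemapGenerator::create('https://example.com')
        ->writeToFile(public_path('sitemap.xml'));
})->twiceDaily(1, 13);
```

For what it's worth, the same package also lets you build the sitemap by hand and set each entry's `<lastmod>` explicitly via `Url::create(...)->setLastModificationDate(...)`, so the date only changes when the page itself does.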