Imagine I have a blog / e-commerce website with 1000 posts / products, and I've created a dynamically generated sitemap for it. It's basically a list with a lot of URLs.
I'm pretty sure crawlers expect me to update the `<lastmod>` dates whenever I edit a product or blog post and change its text content (or its images), add something new, update information, etc. Basically, anything that users will see differently when they visit my page. That makes sense.
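For reference, each entry in my generated sitemap looks roughly like this (the URL and date here are made-up placeholders):

```xml
<!-- One <url> entry per post/product; loc and lastmod values are illustrative -->
<url>
  <loc>https://example.com/products/some-product</loc>
  <lastmod>2020-03-15</lastmod>
</url>
```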
But my question is:
I have a dynamic single-page website, so I don't store static pages; I generate and render them (server-side) at runtime. So what happens if I decide that all my blog posts should now be displayed inside a `<div>`? Or what happens if I add structured data with price and review properties for my products, or structured data for breadcrumbs?
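To illustrate, the kind of structured data I mean would be something like this JSON-LD block (all names and values here are hypothetical examples, not my actual data):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Some Product",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "24"
  }
}
```

Adding a block like this changes nothing visible on the page, but it changes the markup the crawler reads.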
See what I mean? The content the user sees has not changed, but I have updated some tags that the crawler will interpret differently. The text / image content is the same, yet the HTML has changed. This could even have an impact on my ranking, since I'm adding new tags that could improve my SEO.
But now, what should I do? From the crawler's perspective, the changes will present all 1000 posts / products differently with the new tags. Should I update the `<lastmod>` tag for ALL 1000 URLs in my sitemap? Users will continue to see the same text / image content and won't notice any difference.
If I update all 1000 `<lastmod>` tags, won't the crawler find it "weird" that all my URLs were updated on the same day, since they will all have the same `<lastmod>` date? Does that make sense?
Please, any help is appreciated.