Google Search Console cannot find all sitemaps in the sitemap index. How to fix?

I created a valid Sitemap index (/sitemap-dec2019/sitemap_index.xml) for a site as follows:


Each child sitemap is a plain-text list of 49,999 URLs, and every file is under 10 MB in size.
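For reference, a sitemap index of the shape described can be generated with a short sketch like the one below. The child names (sitemap1.txt through sitemap106.txt) and path are taken from the question; the host is a placeholder, not the poster's actual site:

```python
# Illustrative sketch: build a sitemap index for 106 plain-text child sitemaps.
# The host "example.com" is an assumption; the child names follow the listing
# in the question (sitemap1.txt .. sitemap106.txt).
from xml.sax.saxutils import escape

def build_sitemap_index(base_url, child_names):
    """Return sitemap-index XML with one <sitemap> entry per child file."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for name in child_names:
        lines.append('  <sitemap><loc>%s</loc></sitemap>' % escape(base_url + name))
    lines.append('</sitemapindex>')
    return '\n'.join(lines)

children = ['sitemap%d.txt' % i for i in range(1, 107)]
xml = build_sitemap_index('https://example.com/sitemap-dec2019/', children)
print(xml.count('<sitemap>'))  # 106 child entries
```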

I uploaded the sitemap index and all of its child sitemaps to the server. I double-checked that the sitemap index and every child sitemap are accessible on the server, and they are.

However, when I submit the sitemap index to Google via Google Search Console, I get a success message saying "The sitemap index was processed successfully", and no errors are reported anywhere. And yet it lists only 9 of the 106 child sitemaps (and their names all start with 'sitemap1'):

Sitemap                         Status  Discovered URLs
/sitemap-dec2019/sitemap1.txt   Success 49,999
/sitemap-dec2019/sitemap10.txt  Success 49,999
/sitemap-dec2019/sitemap100.txt Success 49,999
/sitemap-dec2019/sitemap101.txt Success 49,999
/sitemap-dec2019/sitemap102.txt Success 49,999
/sitemap-dec2019/sitemap103.txt Success 49,999
/sitemap-dec2019/sitemap104.txt Success 49,999
/sitemap-dec2019/sitemap105.txt Success 49,999
/sitemap-dec2019/sitemap106.txt Success 49,999
1-9 of 9

There is nothing different about those entries in the sitemap index that would cause Google to select only those 9 specific child sitemaps.

The total number of discovered URLs is reported as 449,991, which turns out to be exactly 49,999 × 9.
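The arithmetic behind that observation can be checked in one line:

```python
# Sanity check: 9 child sitemaps at 49,999 URLs each account for every
# discovered URL that Search Console reports.
discovered = 49999 * 9
print(discovered)  # 449991
```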

When I remove the sitemap and re-add the sitemap index, the same 9 child sitemaps appear in the list, and none of the others appear anywhere.

Considering that there are over 100 child sitemaps, this means that over 90% of them are being completely ignored.

How can I fix this so that 100% of the child sitemaps (and their respective URLs) are discovered?

Google Search Console: is it possible to create a txt sitemap index file?

I am submitting sitemaps via Google Search Console, and we are exploring the option of using txt sitemaps.
There are a lot of sitemaps, so a sitemap index file is practically a requirement, but I did not find a way to do this in the sitemap protocol or in Google Search Console.

I tried to create a txt sitemap index file similar to txt sitemaps:


but Google Search Console treats this file as a regular sitemap and does not detect the linked sitemap files or their URLs.

Is it possible to create a txt sitemap index file?
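As background: the sitemap protocol defines the plain-text format only for lists of page URLs; a sitemap index itself has to be XML. Under that assumption, a txt list of child-sitemap URLs can be converted into a valid XML index with a sketch like this (the file names are hypothetical):

```python
# Sketch: convert a plain-text list of child-sitemap URLs (one per line)
# into the XML sitemap-index format the protocol requires.
from xml.sax.saxutils import escape

def txt_list_to_xml_index(txt_content):
    """Each non-blank line of txt_content becomes one <sitemap> entry."""
    urls = [line.strip() for line in txt_content.splitlines() if line.strip()]
    entries = ''.join('  <sitemap><loc>%s</loc></sitemap>\n' % escape(u)
                      for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            '%s</sitemapindex>\n' % entries)

txt = "https://example.com/sitemap1.txt\nhttps://example.com/sitemap2.txt\n"
index_xml = txt_list_to_xml_index(txt)
print(index_xml)
```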

In 2020, is there now a standard approach to describing an international site using XML sitemaps?

I have been writing XML sitemaps for a decade, but never before have I needed to write an XML sitemap for an international site.

It occurs to me that there are three obvious approaches (and there may be better ones).

Approach 1: A single XML sitemap (primary-language URLs only, with their corresponding pages)

Approach 2: A single XML sitemap (all relevant URLs, with their corresponding pages)

Approach 3: Multiple XML sitemaps (the relevant URLs listed per language, with their corresponding pages)
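One widely documented technique for any of these layouts is annotating each URL with its language alternates via `xhtml:link` elements inside the sitemap. A hedged sketch, with invented URLs and only two languages, assuming the Google-documented hreflang-in-sitemap approach:

```python
# Sketch of hreflang annotations in a sitemap: every <url> entry repeats the
# full set of <xhtml:link rel="alternate" hreflang=...> alternates.
# The en/de URLs are placeholders, not a real site.
from xml.sax.saxutils import escape, quoteattr

def url_entries(alternates):
    """alternates: list of (hreflang, url); each alternate gets its own <url>."""
    links = ''.join('    <xhtml:link rel="alternate" hreflang=%s href=%s/>\n'
                    % (quoteattr(lang), quoteattr(href))
                    for lang, href in alternates)
    return ''.join('  <url>\n    <loc>%s</loc>\n%s  </url>\n'
                   % (escape(href), links) for _, href in alternates)

alts = [('en', 'https://example.com/en/page'),
        ('de', 'https://example.com/de/page')]
body = url_entries(alts)
sitemap = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
           '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
           '%s</urlset>\n' % body)
print(sitemap)
```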



Much of the advice available on the web is from the first half of the 2010s.

In 2020, is there now a standard approach to describing an international site using XML sitemaps?

Google Search Console: two WordPress installs with two sitemaps under the same domain name

I have two WordPress installations under a single domain name, like this:

I have created them in Search Console as independent properties, and in Tag Manager each has its own tag and container, etc.

I recently noticed that when I add sitemaps in the ExampleWP2 Search Console property, the same sitemaps are automatically added to the ExampleWP1 property as well. I don't understand how or why; are the properties linked in some way?

I tried deleting all the sitemaps and adding them again, and it still happens.

I am considering deleting all the sitemaps from Search Console in both properties and declaring them only in each site's robots.txt file. Is that a viable option?

In addition, both properties are linked to the same company email account … could that be the cause?

Can anyone advise? Thank you.

seo – Genuine Google Bug – Sitemaps

I am posting this here because it does not seem to appear in Google's Webmaster Community forum (it must contain some keyword that causes it to be deleted). I want to emphasize that I would put a lot of money on this being a genuine bug on Google's side; I am not blaming Google on a whim. This has been an ongoing problem for months, and I have spent dozens of hours trying to fix it.

In a nutshell, there is a bug where Google does not crawl our sitemaps. This is specific to one of our sites (we run 10+). The website URL is

The symptoms:

The website receives thousands of visitors from Google every day, so I really do not think this can be a priority issue.

There is no problem with the sitemap itself. You can find it at and run it through any tester, etc.

I have made two previous posts about this in the webmaster forums. I even submitted a question to the Google Webmaster Hangout on YouTube (but received no response after being told to leave a comment).

Where can I report this bug? What else can I do? I have tried everything to solve this problem and/or get in touch with someone at Google.

seo – Main website with subdirectory = 2 robots and sitemaps respectively?

I have a website and a subdirectory with (I can treat them as two websites).

The main website is based on the service and is not a WordPress site. So I created a and manually, without using any plugins, and submitted them in Webmaster Tools.

I also created a robots.txt and a sitemap for the subdirectory, which is a WordPress site, like and , with the Yoast plugin, and submitted them as well.

Can the main site and its subdirectory have separate robots.txt files and sitemaps, or do we need to use only one robots.txt and one sitemap for the main site (or the subdirectory)?

Also, from the main site's robots.txt, can we declare both the main site's sitemap and the subdirectory's sitemap?
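For what it's worth, the robots.txt Sitemap directive takes an absolute URL and may appear multiple times, so a single robots.txt at the domain root could in principle declare both sitemaps. A hypothetical sketch (host and paths invented for illustration):

```text
# Hypothetical /robots.txt at the domain root; both Sitemap lines are valid.
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/blog/sitemap_index.xml
```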

Why are Google sitemaps important for a business site?

Suppose that at some point you neglect to create a sitemap for your site, and after a while your site loses a lot of organic traffic. Why?

Google Search Console has the wrong sitemaps in its list

I have a BIG site. Recently, when I migrated the site from http to https, I regenerated its 20 sitemaps. I put the old ones in an "old_sitemaps" folder and submitted the new ones in Google Webmaster Tools.
Today I am looking at Google Search Console, and I see that the only sitemaps it lists are the ones I had put in the "old_sitemaps" folder. The correct sitemaps are in Google Webmaster Tools.
So now I wonder what the best solution for this is. I could just delete the old sitemaps …


How to create sitemaps for a website

Help me … I have a website and a problem. The problem is: how do I create a sitemap for my site?
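As a starting point, a basic XML sitemap is just a list of your page URLs in the standard urlset format. A minimal sketch, with placeholder pages in place of a real site:

```python
# Minimal sketch: generate a standard XML sitemap for a handful of pages.
# The URLs below are placeholders; list your site's real pages instead.
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Return urlset XML with one <url> entry per page URL."""
    body = ''.join('  <url><loc>%s</loc></url>\n' % escape(u) for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            '%s</urlset>\n' % body)

pages = ['https://example.com/',
         'https://example.com/about',
         'https://example.com/contact']
sitemap = make_sitemap(pages)
print(sitemap)
```

Saved as sitemap.xml at the site root, a file like this can then be submitted under Sitemaps in Search Console.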

VIMEO and video sitemaps

I'm not entirely sure which Vimeo URL I can use in video sitemaps, as the URLs seem to either redirect or return "forbidden", which would not be acceptable in Google's XML video sitemaps.

How do you create video sitemaps, and what URL do you use for videos hosted on Vimeo?
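For context, a Google video sitemap entry wraps the page URL plus required video fields (thumbnail, title, description, and a content or player location). For externally hosted video, the player location can point at the embeddable player URL. A hedged sketch; the player.vimeo.com form and the video id below are assumptions to verify against your own embed code:

```python
# Sketch of one video-sitemap <url> entry using <video:player_loc> for an
# embedded player. All URLs and the video id are hypothetical examples.
from xml.sax.saxutils import escape

def video_entry(page_url, title, description, thumb, player):
    """Return one <url> entry in Google's video-sitemap format."""
    return ('  <url>\n'
            '    <loc>%s</loc>\n'
            '    <video:video>\n'
            '      <video:thumbnail_loc>%s</video:thumbnail_loc>\n'
            '      <video:title>%s</video:title>\n'
            '      <video:description>%s</video:description>\n'
            '      <video:player_loc>%s</video:player_loc>\n'
            '    </video:video>\n'
            '  </url>\n') % tuple(escape(v) for v in
                                  (page_url, thumb, title, description, player))

entry = video_entry('https://example.com/watch/intro',
                    'Intro video', 'A short introduction.',
                    'https://example.com/thumbs/intro.jpg',
                    'https://player.vimeo.com/video/123456789')
print(entry)
```

The entries would then sit inside a urlset that also declares the video namespace (xmlns:video="http://www.google.com/schemas/sitemap-video/1.1").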