Sitemap best practices for multilingual websites: should I list my URLs in each language, or is it enough to use rel="alternate" + hreflang?

I am working on the sitemaps of a multilingual website and I have doubts about the best practice for referencing each language version of a page.

For a little background: the website references around 20,000 places with community comments and descriptions. The website is available in 5 languages (…)

At the moment, my sitemap only lists the English pages, and for each page it specifies the alternate version for each language (as well as English), as recommended by Google.

In Google Search Console, I see that approximately 75% of pages with valid coverage are flagged as "Indexed, not submitted in sitemap", which makes me think that the alternate link with the hreflang attribute is not enough to "submit" the page for Google to index.

Should I list the pages in all 5 languages in my sitemap, and also keep the hreflang annotations on each entry?
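As far as I know, only URLs listed in a <loc> element count as "submitted" in Search Console, so each language version generally needs its own <url> entry, with the hreflang alternates repeated on every one. A minimal sketch of that pattern (the example.com URLs and the two language codes are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- One <url> entry per language version; each entry repeats
       the complete set of alternates, including itself -->
  <url>
    <loc>https://example.com/en/place-1</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/place-1"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/place-1"/>
  </url>
  <url>
    <loc>https://example.com/fr/place-1</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/place-1"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/place-1"/>
  </url>
</urlset>
```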

How to create a 32 million page XML sitemap for an HTML website [on hold]

I have created an HTML website with 32 crore pages. This is a flight booking website. Now I want to create an XML sitemap for it and implement it.
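For scale: the sitemaps.org protocol caps a single sitemap file at 50,000 URLs and 50 MB uncompressed, so a site this size needs its URLs split across many sitemap files tied together by a sitemap index (an index can itself list up to 50,000 sitemaps). A sketch, with placeholder file names and domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one <sitemap> entry per 50,000-URL chunk -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemaps/flights-0001.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemaps/flights-0002.xml.gz</loc>
  </sitemap>
  <!-- … and so on, generated programmatically … -->
</sitemapindex>
```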

Should my sitemap and robots.txt use http or https URLs for the given scenario?


seo – How do I change the structure of my URLs along with the sitemap for a live website?

I am changing my website's URLs to improve SEO. The current sitemap contains URLs like this:

mysite/browse/1    /* 1 is the ID for fashion */
mysite/browse/2    /* 2 is the ID for real estate */

Now I have changed the URLs to the following format:


I have the code ready to launch along with the new sitemap. If I release the new code, the old URLs will no longer be valid, i.e. mysite/browse/2 will return "404 – page not found".

I think I have to release the code and submit the new sitemap to Google Search Console. But it would take Google some time to crawl the new sitemap, so during this time all my old URLs that appear in the search results will return the "404 – page not found" error.

How can I mitigate this situation?
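The usual mitigation is to keep the old URLs alive as permanent (301) redirects to their new counterparts instead of letting them 404, so both users and crawlers arriving on an old URL land on the new one while Google re-crawls. A hypothetical .htaccess sketch (the new slug URLs shown here are assumptions, since the new format is not given above):

```apache
RewriteEngine On
# Permanently redirect the old ID-based URLs to their new equivalents;
# the slugs "fashion" and "real-estate" are assumed for illustration
RewriteRule ^browse/1$ /browse/fashion     [R=301,L]
RewriteRule ^browse/2$ /browse/real-estate [R=301,L]
```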

seo – Hierarchical vs Flat sitemap

I want to create a sitemap for my e-commerce website.

On the home page, there is a multi-level menu like this:

[screenshot: multi-level menu]

The user can click any of the following links on the home page:

  1. Fashion: enters the Fashion department
  2. Fashion > Women: enters the Women sub-department under Fashion
  3. Fashion > Women > Shoes: enters the Shoes unit under Women

There are also links to pages like Help and About us that the user can navigate to from the home page.

Regarding SEO, what is the best way to create the sitemap?

[image: proposed sitemap structure]

So the user has two ways to navigate to the Women sub-department: either directly from the home page, or by browsing from Home to Fashion and then to the Women page.
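One thing worth noting when weighing hierarchical vs. flat: the sitemap protocol itself has no notion of hierarchy. A sitemap is always a flat list of <url> entries, and any structure lives only in the URL paths and the site's internal links. A sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A sitemap is a flat list: the department/sub-department structure
     is visible only through the URL paths themselves -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/fashion</loc></url>
  <url><loc>https://example.com/fashion/women</loc></url>
  <url><loc>https://example.com/fashion/women/shoes</loc></url>
</urlset>
```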

seo – GSC: Sitemap couldn't be fetched

I am trying to submit a very simple sitemap (for testing only) to Google Search Console but, unfortunately, I keep getting the following error message:

Sitemap      | Type    | Submitted     | Last read | Status         | Discovered URLs
/sitemap.txt | Unknown | July 17, 2019 | —         | Couldn't fetch | 0

When I click on it, an additional error message appears: "(!) Couldn't read the sitemap."
However, if I click "OPEN SITEMAP", it opens normally.

Any idea what is happening?

Sitemap file: sitemap.txt
Server: Apache (debian)

Simple YouTube Grabber, No API Key & Auto Sitemap

Features :
Fully responsive and clean design
Mobile & Desktop
Auto Grab Content (AGC)
iTunes Top Charts
YouTube search
YouTube video results on the search page
Recent searches
No YouTube v3 API key needed (no quota limit)
No database needed
DMCA page, Privacy page, Contact page
Blocks banned keywords
Ads: top banner, bottom banner, popup
Automatic sitemap, keyword injection via TXT
Video & MP3 download formats
Fast loading

Apache / Nginx server (mod_rewrite enabled)
PHP 5.6+ (php curl enabled)


Upload the script to the root directory.
At this point, the script is ready to use.
If you want to change the
site name, email, direct link ads, and so on, open Readme.txt



Simple AGC MP3, No API Key & Auto Sitemap

Features :
Simple, clean and responsive design
Mobile and Desktop
Auto Grab Content (AGC)
No database
No API
Fast loading
iTunes top songs
YouTube search results
Keyword injection
Error filtering
Clean and easy-to-use code
Easy to port

Apache / Nginx server (mod_rewrite enabled)
PHP 5.6+ (php curl enabled)


Upload the script to the root directory.
At this point, the script is ready to use. If you want to change the
site name, email, direct link ads, etc., simply edit the config_edit.php file in the root directory. The file contains all the settings, each with a description.



seo – Should I use an xml sitemap instead of a txt for a site with deeply nested product pages?


I have a B2B spare parts website with around:

  • 25 main categories (organized hierarchically)
  • 150 leaf categories (models)
  • 250 products (unique items, each with quantity = 1)

Targeted visitors are looking for a specific spare part.
Typically, they will not be choosing between several brands and products as in the consumer segment.

The website is aimed at specialists (niche market).

Despite several optimizations, the website still ranks poorly in the search results compared to those of competitors.

I must admit that I am not a fan of social networks, so there are only a few links to the site, which come from specialized forums.

Publishing many products on the home page might help the site's ranking, but it would also create duplicate content with the dedicated product pages.

In this thread, the general consensus is that there is no disadvantage to using a .txt sitemap instead of an .xml one. However, I am not sure this holds in a context where the pages to index are buried deep in the hierarchy and the search engines ignore the intermediate levels.
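For comparison: the .txt format is just one absolute URL per line, while the XML format can attach optional metadata such as lastmod to the same URLs. A sketch reusing the A_model and A_product friendly URLs from the rewrite rules (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- The same URLs a .txt sitemap would list, plus the optional
     metadata that only the XML format can carry -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/A_model</loc>
    <lastmod>2019-07-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/A_product</loc>
    <lastmod>2019-07-01</lastmod>
  </url>
</urlset>
```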

How pages are currently indexed

Google was able to index the leaf category and product pages, which were provided through two text sitemaps (URL lists):

Sitemap with leaf categories:

Sitemap with products:

Most products are reached through the search field, where the visitor enters the model for which he wishes to acquire the spare part(s). The model name is used as a friendly URL, and the .htaccess file rewrites it directly to the category page.

# Currently there are no friendly URLs for intermediate (branch) categories.

# Friendly URLs for leaf categories (models)
RewriteRule ^A_model$ /index.php?cmd=category&cat_id=123 [L]
RewriteRule ^Other_model$ /index.php?cmd=category&cat_id=124 [L]

On the category pages there are links to the unique spare parts.

Friendly URLs are also used there, and the redirection is likewise done with the .htaccess file.

# Friendly URLs for unique products
RewriteRule ^A_product$ /index.php?cmd=products&prod_id=456
RewriteRule ^Other_product$ /index.php?cmd=products&prod_id=789

For the user's convenience, if there is only one spare part available for a given model, there is an automatic redirect from the category page to the single product page, so the category address acts as a short URL (or a gateway, if you prefer) to the product page.

If the visitor wants to browse the categories, he can do it through an ajaxified tree whose expanded nodes load the subcategories on the fly. (For this, the website uses dynatree.js with lazy loading.)

So the robots are aware of the relevant sales landing pages (leaf categories and product pages) but, because they do not have an XML sitemap, the site may seem unstructured to them (they do not know the hierarchical structure).

Why I used .txt sitemaps instead of .xml so far:

  • Easier maintenance: I simply add a new line when a new product or category is published
  • Targeted visitors are experts in their field,
    who know from the start which model / part they are looking for.
  • The intermediate categories (tree branches) are almost irrelevant, apart from
    showing the various families of available products, so there is no need to reference them.


  1. Should I create friendly URLs for the intermediate categories and add
     them to the sitemap
     to make the site look more structured, given
     that these pages would create some duplicate content with the leaf
     categories and product pages?
  2. In this particular case, should I change from .txt sitemaps to .xml? (Even though maintenance would be much harder.)
  3. I plan to replace the ajaxified tree with ajaxified navigation based on tags (filters). Would this make the ranking even worse?
  4. As the home page is more or less like a search engine (i.e. with little content), would you recommend adding some "blah blah blah" (even if it is useless to the visitor) to attract more traffic?

How to configure a sitemap for a MediaWiki website?

I am referring mainly to this line:

If you want a human-readable sitemap, allow read access to the
sitemap.xsl file in your site's configuration (.htaccess file or other)

That line comes from a MediaWiki extension:

I have installed the extension, but I want to know how I can view the contents of the sitemap file.
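Regarding the quoted advice, one way to grant read access to sitemap.xsl in Apache is a <Files> block. This is a sketch under two assumptions: that your Apache is 2.4+ (the Require syntax) and that your configuration otherwise restricts such files:

```apache
# Hypothetical .htaccess fragment: explicitly grant read access to the
# stylesheet so browsers can render the sitemap as human-readable HTML
<Files "sitemap.xsl">
    Require all granted
</Files>
```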