seo – Hierarchical vs Flat sitemap

I want to create a site map for my e-commerce website.

On the home page, there is a multi-level menu like this:

[screenshot of the multi-level menu]

The user can click any of the following links on the home page:

  1. Fashion: enters the Fashion department
  2. Fashion > Women: enters the Women sub-department under Fashion
  3. Fashion > Women > Shoes: enters the Shoes section under Women

There are also standalone pages, Help and About us, that the user can reach from the home page.

Regarding SEO, what is the best way to create a site map?

[diagram of the proposed sitemap structure]

So the user has two ways to navigate to the Women sub-department: either directly from the home page, or by browsing from the home page to Fashion and then to the Women page.
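For what it's worth, an XML sitemap file itself is always a flat list of URLs, no matter how deep the menu goes; any hierarchy is only visible through the URL paths and the internal links. A minimal sketch for the structure above, using purely hypothetical paths on example.com:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Home and the standalone pages -->
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/help</loc></url>
  <url><loc>https://example.com/about-us</loc></url>
  <!-- Department > sub-department > section: the hierarchy lives only in the paths -->
  <url><loc>https://example.com/fashion</loc></url>
  <url><loc>https://example.com/fashion/women</loc></url>
  <url><loc>https://example.com/fashion/women/shoes</loc></url>
</urlset>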

seo – GSC: Sitemap couldn't be fetched

I am trying to submit a very simple sitemap (for testing only) to Google Search Console but, unfortunately, I keep getting the following error:

Sitemap        Type      Submitted       Last read   Status           Discovered URLs
/sitemap.txt   Unknown   July 17, 2019   -           Couldn't fetch   0

When I click on it, an additional error message appears: "(!) Sitemap could not be read."
However, if I click "OPEN SITEMAP", it opens normally.

Question
Any idea what is happening?


Domain: world-hello.ddns.net
Sitemap file: sitemap.txt
Server: Apache (Debian)
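For context, one way to reproduce what a crawler sees (rather than what the browser shows) is to request the file directly; this is only a diagnostic sketch, assuming plain HTTP and the sitemap sitting at the web root of the domain above:

# Fetch the sitemap the way a bot would: -i prints the response headers
# (to spot 4xx/5xx, redirects, or an odd Content-Type), -L follows redirects,
# and -A sets a Googlebot-style user agent.
curl -iL -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "http://world-hello.ddns.net/sitemap.txt"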

Simple YouTube Grabber No API Key & Auto Sitemap | Proxies123.com

Features:
Fully responsive and clean design
Mobile & Desktop
Auto Grab Content (AGC)
iTunes Top Charts
YouTube search
YouTube video results on the search page
Recent searches
No YouTube v3 API key needed (no limit)
No database needed
DMCA page, Privacy page, Contact page
Bad keyword blocking
Ads: top banner, bottom banner, popup
Automatic sitemap, keyword injection via TXT
Video & MP3 download formats
Fast loading

Requirements:
Apache/Nginx server (mod_rewrite enabled)
PHP 5.6+ (php-curl enabled)
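Not part of the listing, but roughly how those two requirements could be checked on a Debian-style Apache box (assuming Apache 2.4 and that the CLI PHP matches the web PHP):

# Enable mod_rewrite and reload Apache (Debian/Ubuntu layout assumed)
sudo a2enmod rewrite
sudo systemctl reload apache2

# Confirm the curl extension is loaded in PHP
php -m | grep -i curl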

Demonstration: https://freevideo.sch.gdn

Installation:
Upload the script to the root directory.
At this point the script is already usable.
If you want to change the site name, email, direct link ads, and so on, open Readme.txt.

DOWNLOAD

Simple AGC MP3 No API Key & Auto Sitemap | Proxies123.com

Simple AGC MP3 No API Key & Auto Sitemap

Features:
Simple, clean and responsive design
Mobile and Desktop
Auto Grab Content (AGC)
No database
No API
Fast loading
iTunes top songs
YouTube search results
Keyword injection
Error filtering
Clean and easy-to-use code
Easy to carry
Ads: top, bottom, popup

Requirements:
Apache/Nginx server (mod_rewrite enabled)
PHP 5.6+ (php-curl enabled)

Demonstration: https://mp3.csyoutube.com

Installation:
Upload the script to the root directory.
At this point the script is already usable. If you want to change the
site name, email, direct link ads, etc., simply edit the config_edit.php file in the root directory. The file contains all the settings, each with a description.

DOWNLOAD

seo – Should I use an xml sitemap instead of a txt for a site with deeply nested product pages?

Context

I have a B2B spare parts website with around:

  • 25 main categories (organized hierarchically)
  • 150 leaf categories (models)
  • 250 products (unique items, each with quantity = 1)

Targeted visitors are looking for a specific spare part.
Typically, they do not hesitate between several brands and products the way consumers do.

The website is aimed at specialists (niche market).

Despite several optimizations, the website still ranks poorly in the search results compared to those of competitors.

I must admit that I am not a fan of social networks, so there are only a few links to the site, coming from specialized forums.

Publishing many products on the home page could help the site's ranking, but it would also create duplicate content with the dedicated product pages.

In this thread, the general consensus is that there is no disadvantage to using a .txt sitemap instead of an .xml one. However, I am not sure this holds when the pages to be indexed are buried deep in the hierarchy and the search engines ignore the intermediate levels.
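For reference, one of those URL lists rewritten as a minimal XML sitemap would look roughly like this (a sketch reusing the placeholder URLs shown below; <lastmod> is optional metadata that the plain-text format cannot carry, and neither format expresses the category hierarchy itself):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/A_model</loc>
    <lastmod>2019-01-01</lastmod> <!-- placeholder date -->
  </url>
  <url>
    <loc>https://example.com/A_product</loc>
  </url>
</urlset>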


How pages are currently indexed

Google was able to index the leaf category and product pages, which were provided through two plain-text sitemaps (URL lists):

Sitemap with leaf categories:

https://example.com/A_model
https://example.com/Another_model
(...)

Sitemap with products:

https://example.com/A_product
https://example.com/Another_product

Most of the products are reached through the search field, where the visitor enters the model for which he wants to acquire the spare part(s). The model name is used as a friendly URL, and the .htaccess file rewrites it directly to the corresponding category page.

# Currently there are no friendly URLs for intermediate (branch) categories.

# Friendly URLs for leaf categories (models)
RewriteRule ^A_model$ /index.php?cmd=category&cat_id=123 [L]
RewriteRule ^Another_model$ /index.php?cmd=category&cat_id=124 [L]

On the category pages there are links to the individual spare parts.

Friendly URLs are used here as well, and the rewriting is likewise done in the .htaccess file.

# Friendly URLs for individual products
RewriteRule ^A_product$ /index.php?cmd=products&prod_id=456
RewriteRule ^Another_product$ /index.php?cmd=products&prod_id=789

For the user's convenience, if there is only one spare part available for a given model, there is an automatic redirect from the category page to the single product page, so the category address acts as a short URL (or a gateway, if you prefer) to the product page.


If the visitor wants to browse the categories, he can do so through an ajaxified tree whose expanded nodes load the subcategories on the fly. (For this, the website uses dynatree.js with lazy loading.)

So the robots are aware of the relevant landing pages for sales (leaf categories and product pages), but because there is no XML sitemap, the site may look unstructured to them (they do not know the hierarchical structure).


Why I used .txt sitemaps instead of .xml so far:

  • Easier maintenance: I simply have to add a new link when a new product or category is published
  • Targeted visitors are experts in their field, so they know from the start which model/part they are looking for.
  • The intermediate categories (tree branches) are almost irrelevant, apart from showing the various families of available products, and therefore it does not seem necessary to index them.

Questions:

  1. Should I create friendly URLs for the intermediate categories and add them to the sitemap in order to make the site look more structured, knowing that these pages would create some duplicate content with the leaf categories and product pages?
  2. In this particular case, should I change from .txt sitemaps to .xml? (Even though maintenance would be much more difficult.)
  3. I plan to replace the ajaxified tree with ajaxified navigation based on tags (filters). Would this make the ranking even worse?
  4. As the home page is more or less just a search engine (i.e., with little content), would you recommend adding a little "blah blah blah" (even if it is useless for the visitor) to attract more traffic?

How to configure a sitemap for a MediaWiki website?

I am referring mainly to this line:

If you want a human-readable sitemap, allow read access to the
sitemap.xsl file in your site configuration (.htaccess file or other)

That line comes from a MediaWiki extension:

https://www.mediawiki.org/wiki/Extension:AutoSitemap

I have installed the extension, but I want to know how I can see the contents of the sitemap file.
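Not from the extension documentation itself, but a minimal sketch of what "allow read access to sitemap.xsl" could look like in an Apache 2.4 .htaccess file (the file name is assumed to match the extension's default, and AllowOverride must permit such rules):

# Let browsers fetch the stylesheet that renders the sitemap as a readable page
<Files "sitemap.xsl">
    Require all granted
</Files>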

Information architecture: Are the tabs and/or steps in a wizard displayed as separate boxes in a sitemap diagram?

EDIT

If this is for developers, every interaction should be accounted for: from tabs to modals, to revealed divs, to links, to scroll interactions. And if there are different interactions between desktop and mobile, or between different breakpoints, then those differences must be documented. It should reference functional flows and use-case documents / Jiras, etc.

This takes a lot of time, so many places only do superficial work.

//EDIT

The site map should be organized in the way that best allows its users to find the information. The site map is not a document for developers; it is for its end users, who presumably are not in the web development business.

As an example, the site map shows that there is a calendar function. You may have a div that shows detailed information about what this calendar function does, but that level of detail probably should not be front and center.

web crawlers: if all web pages are indiscriminately linked by taxonomy (categories), do I still need a sitemap for SEO?

I have a MediaWiki website with approximately 350 web pages.

If all web pages, without exception, are linked to by the taxonomy (categories) and, in addition, there are other indexes of the pages, such as an "all pages" page or a "recent changes" page,
do I still need a sitemap for SEO?

A problem might arise if I forget to add a category to some page.

seo: using curl to submit the sitemap, but it is not reflected in the web portal

There are TWO things that confuse me about how I should submit my sitemap.

First:
It is understood that a sitemap can be submitted with cURL: https://support.google.com/webmasters/answer/183668?hl=en

However, the curl submission of the sitemap is never reflected in the web portal. The timestamp in the web portal only shows the time at which I submitted the sitemap through the web portal itself.

Why are they different? Does the submission via curl actually go through?
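For reference, the method that page describes for submitting with an HTTP request boils down to a plain GET against Google's ping endpoint, roughly like this (a sketch; the sitemap URL is a placeholder and should be URL-encoded):

# Ping Google with the full sitemap URL (placeholder shown)
curl "https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml"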

Second:
I found that Google only crawled my website right after the first time I submitted the sitemap. I update my site and sitemap regularly and expect Google to crawl my site again, but it seems that Google never comes back.

What is going wrong in these two cases?

seo – Why do impressions drop after submitting the sitemap?

We launched a new website last month, and we are using Google Search Console to investigate how to improve traffic.

We found that there was a big increase after submitting the sitemap to Google, but then a big drop in impressions.

Can anyone explain why this is the case? We have attached a picture to this post; the red circles indicate the date on which we submitted the sitemap.

[screenshot of the Search Console impressions chart]