Is it a good idea to submit a sitemap?

Hello everyone.
I have a porn-sharing website, so I wanted to know: should I submit a sitemap to Google? It seems to be the best way to generate organic traffic. The only issue is that Google has verified my Google account with my mobile number. Will that be a problem?

seo – Disallowing a handler in robots.txt while adding its dynamic URLs to the XML sitemap

I’m using ASP.NET WebForms, and I have a page, let’s call it Subjects.aspx. I don’t want crawlers to crawl that page itself, but I do want them to crawl the dynamic URLs that are powered by it, for example /subjects/{id}/{title}, which routes to Subjects.aspx.

I used a crawling tool and the page /Subjects.aspx was found. Is it okay to disallow that page in robots.txt like the following:

User-agent: *
Disallow: /subjects.aspx/

while adding the dynamic URLs to the sitemap?
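For reference, robots.txt rules are case-sensitive prefix matches, so a rule intended to block the handler page itself (the exact path casing and the example.com sitemap URL here are assumptions) would look something like:

```
User-agent: *
Disallow: /Subjects.aspx

Sitemap: https://example.com/sitemap.xml
```

Note that Disallow only blocks crawling, not indexing: a disallowed URL can still appear in search results without a snippet if it is linked from elsewhere.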


Google is indexing URLs with parameters that are disallowed in robots.txt despite canonical URLs without parameters listed in the sitemap

All of my webpages are showing up with ?mode=grid and ?mode=list parameters in the Google Search Console coverage report, while the submitted sitemap lists the normal URLs. For example:

[url in sitemap]

And robots.txt has the rule Disallow: /*?, which has blocked all of those pages from being indexed. I don’t want to remove the disallow rule, because removing it will let ?mode=grid and ?mode=list URLs show up in Google searches. How can I get the webpages indexed? Also, this is a WordPress website.
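For context, the usual alternative to a blanket robots.txt block is a rel=canonical tag on the parameterized views, which Google can only see if those URLs are crawlable (the path below is a placeholder):

```html
<!-- Served on both /shop?mode=grid and /shop?mode=list (hypothetical paths) -->
<link rel="canonical" href="https://example.com/shop/">
```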

seo – Can I have separate URLs in the internal navigation of my website and in the sitemap?

I have a not-so-SEO-friendly URL structure in the internal navigation of my website (due to some technical challenges), but I have marked each page with an SEO-friendly canonical (through the CMS), and these canonical URLs are in the sitemap as well.

To give an example, my internal navigation links to www.example/123, which is marked as canonical of www.example/friendliness/123, and “www.example/friendliness/123” is in the sitemap as well.

So my question is: is this sufficient, or do I need to change the internal linking (www.example/123 in the example above) as well?

Anyway, I have seen people mark URLs as canonical to ignore query parameters, so it should behave the same way here.
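A sketch of the setup described above, assuming this is roughly the tag the CMS emits on the internally linked URL:

```html
<!-- On www.example/123, pointing to the SEO-friendly URL in the sitemap -->
<link rel="canonical" href="https://www.example/friendliness/123">
```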

How to create a sitemap for a WordPress site

I changed the domain name of my website and I want Google to index the new domain. When I try to create a sitemap for the website on the new domain, I get a 404 error. Please help me overcome this problem.

seo – Google Search Console cannot read my XML: Sitemap appears to be an HTML page

I’m working on a web application written with Angular (v8) and deployed on Apache 2, which uses a proxy to forward requests (frontend, API, back office).

My problem is that when I try to submit the sitemap ({website}/sitemap.xml) to Google, Google Search Console keeps saying that it’s not valid: Google can read the URL, but the response appears to be HTML.


My sitemap:

I tried to validate the XML on several websites and didn’t find any errors.

I mention Apache because maybe when Google tries to fetch the URL, Apache serves another page instead of the XML, but I cannot prove that. I have tried many ways, and the first thing I see when opening the URL is the sitemap and nothing else.

In my angular.json I added the file to the assets as follows:

"assets": ["src/favicon.ico", "src/assets", "src/sitemap.xml"],

What could it be?

Thank you
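One thing worth checking, assuming the proxy forwards every request to the Angular app: an exclusion rule lets Apache serve the static file itself instead of handing the request to the frontend (the file path below is a placeholder for wherever the sitemap is deployed):

```apacheconf
# Exclude sitemap.xml from the proxy so Apache serves the file directly.
ProxyPass /sitemap.xml !
Alias /sitemap.xml /var/www/app/sitemap.xml

# Make sure the file is sent with an XML content type.
<Files "sitemap.xml">
    ForceType application/xml
</Files>
```

Note that in mod_proxy, the exclusion (`ProxyPass /sitemap.xml !`) must appear before the general `ProxyPass` directive to take effect.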

seo – Sitemap: Should I dynamically update sitemap for dynamic content or create a page containing all the dynamic links

You should create an XML sitemap and dynamically add new dynamic URLs to it. There is no need to use lastmod in an XML sitemap: Google says they ignore lastmod because few sites keep it up to date. Googlebot will notice any new URLs in the XML sitemap and will come crawl them whether or not they have a lastmod specified.
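A minimal dynamically generated sitemap along these lines needs nothing more than `<loc>` entries (the example.com URLs are placeholders matching the route pattern from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per dynamic page; no <lastmod> needed. -->
  <url><loc>https://example.com/subjects/1/first-title</loc></url>
  <url><loc>https://example.com/subjects/2/second-title</loc></url>
</urlset>
```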

You should not create a single page that links to all your dynamic URLs. HTML sitemaps don’t work well for SEO anymore. XML sitemaps work fine for getting your content crawled.

When you have an XML sitemap, Googlebot will come and crawl all your content. However, it is likely that most of the URLs in it won’t get indexed, and any that do get indexed won’t rank well. This is known as “The Sitemap Paradox”. To get most of your dynamic URLs indexed and ranking, they need to link to each other. That is why this site has the “Related Questions” section on each question page.

Your home page should link to a few of your best and most recent URLs. Those URLs will in turn pass some of that link juice on to other dynamic URLs.

Once some of your URLs get external links, the link juice from external links will be passed further to other dynamic URLs as well.

seo – Sitemap: Should I dynamically update the sitemap for dynamic content or create a page containing all the dynamic links

Say I have the following route: http://<my-domain>/{category}/subjects/{id}/{title}

The parts in braces are dynamic. I’m struggling with which of the following is better, or whether there is a better way to let Google crawl all these dynamic links:

Approach 1: manually doing the job by removing or adding records in the sitemap and updating <lastmod>

Approach 2: create a page that includes all those links and reference that page in sitemap.xml

The page in the second approach could be a plain HTML file generated by the server app, or a simple WebForms .aspx page that generates those links dynamically without having to create an HTML file.

Any suggestions would be really appreciated, thanks!

seo – Sitemap for one website in multiple domains

I have 2 domains (A and B). The first one (A) is the website.

In the sitemap.xml of domain “A”, I have a sub-sitemap hosted on domain “B” that contains “A” URLs.
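As a sketch of that setup (all URLs below are placeholders), the sitemap index served from A references the sub-sitemap hosted on B:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml served from domain A -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Sub-sitemap hosted on domain B, listing domain A URLs. -->
  <sitemap>
    <loc>https://b.example/sitemap-a.xml</loc>
  </sitemap>
</sitemapindex>
```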

I have followed the documentation, so these 2 domains are both verified in Google Search Console, but Google does not index the sub-sitemap (the one on the “B” domain containing “A” URLs). The other sub-sitemaps on the “A” domain are OK.

FYI: the sitemap is valid, because if I submit it manually in Search Console (on the B domain property), the URLs are displayed in Google search results. But for multiple reasons, I can’t submit it manually every time.

Do you have any ideas? Thanks