seo – How to generate a sitemap for an Angular web application

I am developing a website using the MEAN stack. I want to create a sitemap covering all URLs on the site, plus video and image sitemaps. The website is dynamic, and more links, images, and videos will certainly be added in the future, so I would like a dynamic sitemap that does not need to be updated by hand for each new link, image, and video.
Please share any ideas you have about sitemap generation, because I have zero knowledge of it. Any help will be appreciated.
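One common approach in a MEAN app is to build the sitemap on the fly from whatever URL list your database can produce, rather than maintaining a static file. Below is a minimal sketch; `buildSitemap`, the `Page` model, the route name, and `example.com` are all assumptions for illustration, not part of any particular library:

```javascript
// Build sitemap XML from an array of { loc, lastmod? } entries.
function buildSitemap(urls) {
  const entries = urls
    .map(u => [
      '  <url>',
      `    <loc>${u.loc}</loc>`,
      u.lastmod ? `    <lastmod>${u.lastmod}</lastmod>` : null,
      '  </url>',
    ].filter(Boolean).join('\n'))
    .join('\n');

  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    '</urlset>',
  ].join('\n');
}

// In an Express app you might then serve it from a route, querying a
// hypothetical Mongoose model so new pages appear automatically:
//
// app.get('/sitemap.xml', async (req, res) => {
//   const pages = await Page.find({}, 'slug updatedAt');
//   const urls = pages.map(p => ({
//     loc: `https://example.com/${p.slug}`,
//     lastmod: p.updatedAt.toISOString().slice(0, 10),
//   }));
//   res.type('application/xml').send(buildSitemap(urls));
// });
```

Image and video sitemaps follow the same pattern with the `image:` and `video:` namespace extensions added to each `<url>` entry.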

8 – How do I debug/kint the Simple XML Sitemap link array?

I am trying to find out how I can debug/kint a variable/array from the Simple XML Sitemap module.

I worked through the documentation here: https://www.drupal.org/docs/8/modules/simple-xml-sitemap/api-and-extending-the-module#s-api-hooks to find the hook I need.

My goal is to unset any links that contain node/, in order to remove published but un-aliased nodes of the included content types.

The 'path' array key appears to hold the un-aliased URL, and the code below removes every link except the home page. I am unsure how to kint($link) inside this function so I can see what other array keys are available and what else I might use for comparison.

function HOOK_simple_sitemap_links_alter(array &$links, $sitemap_variant) {
  foreach ($links as $key => $link) {
    // kint($link); // Dump one link here to inspect its available keys.
    if (strpos($link['meta']['path'], 'node/') !== FALSE) {
      unset($links[$key]);
    }
  }
}

Is there a way to kint() these sitemap arrays? Or maybe some documentation that shows the structure of these arrays?

Do you ping Google when you update your sitemap?

When I update my website, I can see in the access logs that Google downloads my sitemap within 48 hours. If I use something like https://submit-sitemap.com/, I can see in my logs immediately that Google has downloaded my sitemap.

Could I be penalized for pinging Google too often?

Google Search Console coverage report shows "Submitted URL marked ‘noindex’" for a 404 page that is not in the sitemap and has no noindex tag

Error: "Submitted URL marked ‘noindex’" in the Google Search Console coverage report

Situation

  • URL is not submitted in our sitemap
  • Google is not indexing the URL
  • There is no noindex tag on the page
  • The URL is not disallowed in robots.txt
  • The file can’t be found on the server, and the URL returns a 404 on the front end
  • Resubmitted the sitemap and the error persists

Has anyone encountered this situation? How can this be fixed?

Separate Sitemap for Mobile Website

Hello Friends,

Do we need to create a separate sitemap for a mobile website? My mobile site is on a subdomain, m.axydotcom (i.e)
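For context, when the mobile site mirrors the desktop site page for page, Google's separate-URLs guidance relies on annotations rather than a second sitemap: each desktop entry in the existing sitemap declares its mobile counterpart with a media alternate link, and each mobile page canonicals back to the desktop URL. A minimal sketch, with example.com / m.example.com as placeholder domains:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page-1</loc>
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="https://m.example.com/page-1"/>
  </url>
</urlset>
```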

xml sitemap – Can’t view my page map correctly?

I have installed the Geolocation module to display the location maps that I have on my website.

I also checked the Google Maps API and added a Google Maps API key. After creating the Location content type, I made sure that Manage form display shows the "Geolocation Google Maps API – Geocoding and Map" widget.

Then I added the content that I wanted, but I got this error: "This page can’t load Google Maps correctly."
Is there something wrong with the steps that I followed?

sitemap – Problem with how Google indexes multi-language and multi-domain website

I have a problem with how Google indexes a multilanguage/multidomain website.

The website handles two regions/languages across two domains. The language is identified by a URL segment, not by the domain. Ideally we want the site to open either as website.com/en or as website.de/de.

However, that is not how Google indexes the site: I can see, for example, website.de/en or website.com/de in the results (depending on how the search is done). This is very confusing for clients, as it leads them to the wrong version of the site.

What I’m trying to figure out is how to tell Google to index one site only under website.com/en/* and the other only under website.de/de/*, and not to create a mix of both, as it does currently.

My initial thought was to tweak the sitemap. At the moment it lists all the links that exist on both the en and de websites, but with the .com domain. It looks like this:

website.com/en
website.com/de
website.com/en/contact
website.com/de/kontakt
website.com/en/our-team
website.com/de/unser-team

and so on.

My question is: how can this problem be solved? Is a sitemap the best way to go? And if so, what should such a sitemap look like in 2020 to handle this setup?

Or is there another solution?
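For reference, the standard mechanism for telling Google which language version lives on which domain is hreflang alternate annotations, which can go directly in the sitemap. A sketch using the paths from the question (whether this resolves the cross-domain mixing depends on each page also serving a matching canonical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://website.com/en/contact</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://website.com/en/contact"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://website.de/de/kontakt"/>
  </url>
  <url>
    <loc>https://website.de/de/kontakt</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://website.com/en/contact"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://website.de/de/kontakt"/>
  </url>
</urlset>
```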

Is a video sitemap enough for SEO or should video markup schema also be used?

I have already implemented a video sitemap, based on Google’s recommendation. Now I am reading about video markup. The only information I have found on how the two approaches relate to each other is:

Use both on-page markup and video sitemaps. A common scenario is to use on-page markup for all of your videos and a video sitemap to tell Google about any new, time-sensitive, or hard to find videos.

AFAIK, the video sitemap is more widely supported among search engine providers; Bing, for example, seems to have only just started supporting JSON-LD.

Is there any reason that speaks for using video markup in addition to a video sitemap?
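For reference, the on-page side of Google’s recommendation is usually a JSON-LD VideoObject embedded in the page. A minimal sketch; every value below is a placeholder:

```json
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example video title",
  "description": "Short description of the video.",
  "thumbnailUrl": "https://example.com/thumb.jpg",
  "uploadDate": "2020-01-01",
  "contentUrl": "https://example.com/video.mp4"
}
```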

Can you request refresh of the sitemap in Google Search Console?

TL;DR: In Search Console my sitemap shows a last crawl date of 2–3 days ago, and I wondered if there is a method of requesting that Google re-crawl the sitemap, in the same way you can ask it to re-crawl a page.

Explanation – I have a problem with an updated page not showing correctly in search results. I have asked for several re-crawls over several days, without luck. I believe this may be the result of an old lastmod value in my sitemap. I have corrected that issue, and a live view of the sitemap now shows the correct lastmod. I think Google was skipping my page re-crawl requests because its last index was newer than the old lastmod. Now the lastmod is newer than the last crawl date, but Google is not touching the sitemap file to see the updated value. I suspect it will in its own time over the next few days, but it is still annoying…

Is there a way of requesting a sitemap re-crawl?