Why are Bing and SEMRushBot ignoring crawl-delay in my robots.txt file?


We host a large number of sites with a large number of catalog pages, and we would like to slow down several bots whose traffic is quite excessive. Specifically, we get a great deal of traffic from Bingbot and SemrushBot. The documentation pages for both bots say that they obey the Crawl-delay directive, yet despite setting crawl delays for both, I have seen no change in traffic even after several days. Is there something wrong with my file? (I set the SemrushBot delay to 60, although I have read that it only honors a delay of up to 10 seconds; either way, I have seen no change at all since adding these lines to robots.txt.) My full robots.txt is below, followed by a quick sanity check of how it parses.

User-agent: *
Disallow: /nobots/
Disallow: /products/features/
Disallow: /product/features/
Disallow: /product/reviews/
Disallow: /webservices/ajax/
User-agent: yahoo-mmcrawler
Disallow: /m/
User-agent: MJ12bot
Disallow: /
User-agent: AhrefsBot
Disallow: /
User-agent: SemrushBot
Crawl-delay: 60
User-agent: Bingbot
Crawl-delay: 10
Disallow: /nobots/
Disallow: /products/features/
Disallow: /product/features/
Disallow: /product/reviews/
Disallow: /webservices/ajax/
User-agent: dotbot
Crawl-delay: 1
User-agent: Goodzer
Crawl-delay: 1
User-agent: rogerbot
Crawl-delay: 5
User-agent: Baiduspider
Disallow: /
User-agent: YandexBot
Disallow: /
User-agent: YandexImages
Disallow: /
User-agent: Linguee Bot
Disallow: /
User-agent: Seekport Crawler
Disallow: /
User-agent: GrapeshotCrawler
Crawl-Delay: 1
User-agent: istellabot
Disallow: /
User-agent: SeznamBot
Disallow: /
Sitemap:
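
For reference, here is a rough way to sanity-check how the file above parses. This is only a sketch: it uses Python's urllib.robotparser, which is not the parser Bing or Semrush actually use, and it assumes a local copy of the file saved as robots.txt.

# Rough sanity check: how does a generic robots.txt parser read the file above?
# Assumptions: the file is saved locally as "robots.txt", and "SomeRandomBot"
# is just a placeholder agent to show the unmatched case.
from urllib.robotparser import RobotFileParser

with open("robots.txt") as f:
    lines = f.read().splitlines()

rp = RobotFileParser()
rp.parse(lines)

for agent in ("Bingbot", "SemrushBot", "dotbot", "GrapeshotCrawler", "SomeRandomBot"):
    # crawl_delay() returns the delay declared by the group matching this
    # user agent, or None if no matching group sets one (the "*" group here
    # declares no Crawl-delay).
    print(agent, "->", rp.crawl_delay(agent))

With the file as posted, this resolves 10 for Bingbot and 60 for SemrushBot, so the directives at least parse; whether each crawler actually honors those values is a separate question.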