in my recent project, i hit a common roadblock when trying to get our new sitemap indexed. turns out there was an issue in `robots.txt`: crawlers kept hitting a snag because the user-agent directive wasn't specified correctly.
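for reference, here's a minimal sketch of what a fixed file can look like (the sitemap url is a placeholder, not the real one from the project):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```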
to fix it:

- ensure you have a user-agent defined:
  `user-agent: *`
- add your sitemap url with a `Sitemap:` directive if not already done.
- test with google search console's robots.txt tester tool.
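if you want a quick local sanity check before reaching for search console, python's stdlib can parse the file for you. a minimal sketch (the url and file contents here are made-up examples, not the project's actual ones):

```python
from urllib.robotparser import RobotFileParser

# hypothetical robots.txt contents for illustration
robots_txt = """\
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# an empty Disallow means everything is allowed for this user-agent
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))

# site_maps() (python 3.8+) returns the Sitemap urls the parser found
print(parser.site_maps())
```

this only checks that the file parses the way you intend; google's tester is still the authority on how googlebot actually reads it.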
this tweak resolved our indexing issues and got everything crawling smoothly!