SEO Indexing / Visibility - robots.txt error

Things I have tried

Found an ‘Unknown directive’ problem when examining the site’s robots.txt via Google’s PageSpeed report: https://pagespeed.web.dev/report?url=https%3A%2F%2Fprojectbubbleburst.com%2F&form_factor=desktop
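As a quick sanity check, the served file can also be fetched directly, independent of Google’s tooling (using the site URL above):

```
curl -s https://projectbubbleburst.com/robots.txt
```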

Initially, the site was password protected and the crawling option was disabled.

Subsequently, it was made public and crawling was enabled.

I have toggled the crawling switch again recently, to no avail; the ‘Unknown directive’ problem still shows.

The robots.txt file doesn’t resemble a standard one (see https://forum.obsidian.md/robots.txt for an example).
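For comparison, a minimal, permissive robots.txt usually looks something like this (an illustrative sketch, not what Publish actually generates; the Sitemap line assumes the auto-generated sitemap.xml mentioned further down):

```
# Allow all crawlers to access everything
User-agent: *
Disallow:

# Advertise the auto-generated sitemap
Sitemap: https://projectbubbleburst.com/sitemap.xml
```

An empty Disallow line permits all crawling; the Sitemap directive simply tells crawlers where the sitemap lives.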

Of course, the second I posted this, I realised I could probably create my own to overwrite any auto-generated one. Will see what happens and report back…

What I’m trying to do

Enable search engine crawlers to index the site.

Still no joy with creating a robots.txt file locally.

Publish does not seem to support .txt files - nor other file types, as per Support all file types for the Publish service - #6 by jparadie

In addition, I can see that a sitemap file (sitemap.xml in the site root directory) is automatically created.
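For reference, a sitemap.xml following the standard sitemap protocol looks roughly like this (illustrative only; the URL and date below are placeholders, and the auto-generated file’s contents may differ):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per published page -->
  <url>
    <loc>https://projectbubbleburst.com/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
</urlset>
```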

Any way to get better control of these files?


I am having a similar problem; Google is complaining that it can’t index the site. Did you ever find a solution?

Apologies for the tardy reply (missed notifications) - no is the short answer. Been distracted anyhoo with the ongoing insanity that is this world!

I see the team are working on SEO improvements on the Trello roadmap, so fingers crossed a sitemap feature will be forthcoming.

