Publish: let the user provide a custom robots.txt (block crawling, LLM content protection)

Hello! With OpenAI’s announcement that they will now honor robots.txt directives that disallow their crawler from ingesting our websites, has there been any progress on letting us apply this block to our Obsidian Publish sites? Thank you!
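For context, the block itself is simple: OpenAI’s crawler identifies itself with the `GPTBot` user-agent, so a robots.txt at the site root along these lines would opt the site out (a sketch of what Publish could serve if this option were added):

```txt
# Disallow OpenAI's crawler from the entire site
User-agent: GPTBot
Disallow: /

# All other crawlers remain unaffected
User-agent: *
Allow: /
```

Since Publish controls the site root, this would presumably need to be exposed as a setting or a custom-file upload, as we can’t place files there ourselves.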