Today, websites are not only used to provide information to people; they are also used to provide information to large language models. There is currently no standard way to do this correctly.
Proposed solution
Create an llms.txt markdown file that provides brief background information and guidance, along with links to markdown files (which can also link to external sites) providing more detailed information. See the links below for the full details; a sketch of the format follows.
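For illustration, here is a minimal sketch of what such a file could look like, served from the site root as `/llms.txt` and following the structure described in the llms.txt proposal (an H1 title, a one-line blockquote summary, then sections of annotated links). The project name, section names, and URLs below are made up:

```markdown
# Example Project

> Example Project is a hypothetical web framework; this one-line summary
> is the first thing an LLM should read.

Optional free-form notes with extra context can go here.

## Docs

- [Quick start](https://example.com/docs/quickstart.md): Installation and first steps
- [API reference](https://example.com/docs/api.md): Detailed API documentation

## Optional

- [Changelog](https://example.com/changelog.md)
```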
There was another related feature request (one or two years ago) to do just the opposite: protect a website from web crawling and scraping via a robots.txt configuration. So it would be nice to have both options: enable LLM-friendly content by adding an llms.txt file, or, on the contrary, (try to) block it by setting the right directives in the robots.txt file.
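For that opposite direction, a robots.txt along these lines could ask LLM crawlers to stay away. The user-agent tokens below are ones some vendors have documented, but the list changes over time, so treat this as an illustration rather than a complete inventory:

```
# robots.txt — opt out of LLM crawling/scraping.
# These user-agent tokens are examples; check each vendor's documentation
# for current crawler names. Note that compliance is voluntary on the
# crawler's side — robots.txt is a request, not an enforcement mechanism.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```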