An anonymous reader writes: Ed Foudil, a web developer and security researcher, has submitted a draft to the IETF (Internet Engineering Task Force) seeking the standardization of security.txt, a file that webmasters can host at their domain root to describe the site's security policies. The file is akin to robots.txt, the standard websites use to communicate crawling policies to web and search engine crawlers…
For example, if a researcher finds a vulnerability on a website, they can consult the site's security.txt file for information on how to contact the company and report the issue securely. According to the current security.txt IETF draft, website owners would be able to create security.txt files that look like this:
#This is a comment
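The comment line above is only the opening of the draft's example. A fuller sketch, using the Contact and Encryption fields defined in the draft (the e-mail address, phone number, and URL below are hypothetical placeholders, not values from the draft itself), might look like:

```
# This is a comment
Contact: security@example.com
Contact: +1-201-555-0123
Encryption: https://example.com/pgp-key.txt
```

Here the Contact lines tell a researcher where to send a report, and the Encryption line points to a PGP key for sending it confidentially.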
Read more of this story at Slashdot.