Proposed Security.txt will work like Robots.txt
Ed Foudil, a web developer and security researcher, has submitted a draft to the IETF (Internet Engineering Task Force) seeking the standardization of security.txt, a file that webmasters can host at their domain root to describe the site’s security policies. The file is akin to robots.txt, a standard used by websites to communicate and define policies for web and search engine crawlers....
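To illustrate the idea, a site might serve a file like the following at its domain root. This is a hypothetical sketch: the exact directive names and their format are defined by the draft itself, and the URLs and email address below are placeholders.

```text
# Hypothetical security.txt for example.com (illustrative only)
# Directive names follow the style proposed in the draft.
Contact: security@example.com
Encryption: https://example.com/pgp-key.txt
Acknowledgments: https://example.com/hall-of-fame.html
```

Just as crawlers know to request /robots.txt before indexing a site, a security researcher who finds a vulnerability could fetch /security.txt to learn how the site owner wants to be contacted.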