Add robots.txt
This is mainly an SEO thing. The file is almost never altered. Put the needed text into a plain text file and place it in the "misc" folder. Add something like this to robots.txt:
Sitemap: https://static.go4webdev.org/sitemap.xml
User-agent: *
Allow: /
Then serve the file at this path as a regular endpoint:
switch path {
case "robots.txt":
	http.ServeFile(w, r, "public/misc/robots.txt")
	fmt.Println(path)
case "sitemap.xml":
	http.ServeFile(w, r, "public/misc/sitemap.xml")
	fmt.Println(path)
case "contactme":
	contactme(w, r)
default:
	setheader(w)
	exist := tpl.Lookup(page)
	if exist == nil {
		redirect(w, r) // if not found -> redirect.go
		return         // stop here so the template is not executed after the redirect
	}
	url := geturl(path)
	tpl.ExecuteTemplate(w, page, url)
	ip, agent := GetIPAndUserAgent(r)
	msg := (ip + " " + agent)
	stat(msg + " > " + page)
}
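For reference, here is a minimal, self-contained sketch of how such a catch-all handler could be wired up to serve both files. The route registration, the port, and the strings.TrimPrefix call are assumptions for illustration, not taken from the project code:

package main

import (
	"net/http"
	"strings"
)

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Strip the leading slash so the path matches the bare names
		// used in the switch above ("robots.txt", "sitemap.xml").
		path := strings.TrimPrefix(r.URL.Path, "/")
		switch path {
		case "robots.txt":
			http.ServeFile(w, r, "public/misc/robots.txt")
		case "sitemap.xml":
			http.ServeFile(w, r, "public/misc/sitemap.xml")
		default:
			http.NotFound(w, r)
		}
	})
	http.ListenAndServe(":8080", nil)
}

You can verify the endpoint with curl http://localhost:8080/robots.txt, which should return the contents of public/misc/robots.txt.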