Does Go sanitize URLs for web requests?


I'm implementing a simple web server in Go. I have no experience in web development, and a serious question has struck me.

Let's say I'm serving web pages with a modified loadPage function, like here:

func loadPage(title string) []byte {
    filename := title
    body, _ := ioutil.ReadFile(filename)
    return body
}

func handler(w http.ResponseWriter, req *http.Request) {
    content := loadPage(req.URL.Path[1:])
    fmt.Fprintf(w, "%s", content)
}
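The snippet doesn't show how the handler is registered; for a runnable picture, assume the usual DefaultServeMux wiring (the http.HandleFunc and http.ListenAndServe calls below are an assumption, not part of the code above):

package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
)

func loadPage(title string) []byte {
    filename := title
    body, _ := ioutil.ReadFile(filename) // error deliberately ignored, as above
    return body
}

func handler(w http.ResponseWriter, req *http.Request) {
    content := loadPage(req.URL.Path[1:])
    fmt.Fprintf(w, "%s", content)
}

func main() {
    // Registering on the DefaultServeMux; passing nil to ListenAndServe
    // means that mux ends up serving every request.
    http.HandleFunc("/", handler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

With that wiring, every request passes through the DefaultServeMux, which is exactly the piece discussed below.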

Technically, this allows me to write a request in the form of

 http://example.com/../../etc/passwd 

and the code will happily serve up the /etc/passwd file, will it not? I mean, is there some sort of protection against ../ in the Go http package or in the HTTP protocol itself, or am I doing something wrong and creating a security hole?

net/http does this in its HTTP request multiplexer, ServeMux:

ServeMux also takes care of sanitizing the URL request path, redirecting any request containing . or .. elements to an equivalent .- and ..-free URL.
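You can observe that redirect without touching the filesystem by driving a ServeMux directly with net/http/httptest (a minimal sketch; the handler body and the test path are just placeholders I chose):

package main

import (
    "fmt"
    "net/http"
    "net/http/httptest"
)

func main() {
    mux := http.NewServeMux()
    mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        // The handler only ever sees already-cleaned paths.
        fmt.Fprintf(w, "path seen by handler: %s", r.URL.Path)
    })

    // Build an incoming request whose path still contains ".." elements.
    req := httptest.NewRequest("GET", "/../../etc/passwd", nil)
    rec := httptest.NewRecorder()
    mux.ServeHTTP(rec, req)

    // The mux answers with a redirect to the cleaned path instead of
    // invoking the handler with the raw one.
    fmt.Println(rec.Code)                     // expected: 301
    fmt.Println(rec.Header().Get("Location")) // expected: /etc/passwd
}

Because the client is redirected rather than silently rewritten, a handler registered on the mux never observes a path containing .. elements.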

The relevant function is the private func cleanPath(p string) string, which calls path.Clean:

np := path.Clean(p)

path.Clean does the appropriate removals:

case path[r] == '.' && path[r+1] == '.' && (r+2 == n || path[r+2] == '/'):
    // .. element: remove to last /
    r += 2
    switch {
    case out.w > dotdot:
        // can backtrack
        out.w--
        for out.w > dotdot && out.index(out.w) != '/' {
            out.w--
        }
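The effect on a rooted path is easy to check directly (illustrative inputs of my own, not taken from the net/http source):

package main

import (
    "fmt"
    "path"
)

func main() {
    // A ".." after a normal element backtracks over it...
    fmt.Println(path.Clean("/a/b/../c")) // /a/c

    // ...and ".." elements that would climb above the root are simply dropped.
    fmt.Println(path.Clean("/../../etc/passwd")) // /etc/passwd
}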

There's an additional case if the path isn't rooted, but cleanPath above makes sure that it is, prepending a forward slash to the path to be cleaned if there isn't one already.
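That rooting step matters, because on a relative path path.Clean preserves leading .. elements instead of dropping them (again, my own illustrative inputs):

package main

import (
    "fmt"
    "path"
)

func main() {
    // Relative: the leading ".." elements survive cleaning.
    fmt.Println(path.Clean("../../etc/passwd")) // ../../etc/passwd

    // Rooted first, as cleanPath ensures: they are removed.
    fmt.Println(path.Clean("/" + "../../etc/passwd")) // /etc/passwd
}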

