On my site I can trigger certain things using a GET request, like the ability to hide or delete a comment. I am not very worried, but it would be pretty annoying if someone designed a page that triggered those requests on a logged-in user's behalf.
You have confused a couple of common issues here.
Firstly, the attack, as others have noted, is called a cross-site request forgery (CSRF). It is possible to cause either GETs or POSTs from another domain, and because the request goes to your domain, the browser will send along the cookies for your domain, which include the session details.
To counter this, when a user logs in, generate a token (some random string of characters) that all links and forms on your site pass back during that session. When a request comes in, take the session details from the cookie and look up which token should have been GETted/POSTed for that session. If the correct token has not been passed, you can ignore the request, inform the user, or log the details for further investigation. I'd recommend the last, because when implementing this you may well miss a few links or forms, which will then stop working; users may simply leave rather than take the time to tell you about it.
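A minimal sketch of that per-session token check, assuming a Flask-style app with a server-side session; the route and function names here are hypothetical, not from the question:

```python
import hmac
import secrets

from flask import Flask, abort, request, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # required for Flask's signed session cookie


def issue_csrf_token():
    """Call this at login: store a fresh random token in the session,
    then embed it in every link and form rendered for that user."""
    session["csrf_token"] = secrets.token_urlsafe(32)
    return session["csrf_token"]


def check_csrf_token():
    """Compare the token sent with the request against the one stored for the session."""
    sent = request.form.get("csrf_token") or request.args.get("csrf_token", "")
    expected = session.get("csrf_token", "")
    # Constant-time comparison avoids leaking the token through timing differences.
    return bool(expected) and hmac.compare_digest(sent.encode(), expected.encode())


@app.route("/comment/<int:comment_id>/delete", methods=["POST"])
def delete_comment(comment_id):
    if not check_csrf_token():
        # Log rather than failing silently, so missed links/forms show up in the logs.
        app.logger.warning("CSRF token mismatch for comment %s", comment_id)
        abort(403)
    # ... perform the deletion here ...
    return "deleted"
```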
Secondly, GET requests should be safe (i.e. simply cause data to be displayed, with no changes made) and POSTs should be used for all data-altering requests. Firstly, in case a spider manages to follow a link and causes changes that spiders shouldn't be causing. Secondly, as a backup to the user refreshing the page: the browser should remind them that they will be resubmitting the request and ask whether they want to continue. I say "as a backup" because all your requests should be written so that they are harmless or ignored if resubmitted, i.e. don't have a button that requests "delete the last item"; instead look up that the id of the last item is 1423 and have the button request that 1423 is deleted. If this is submitted twice, then the second time around your validation should notice that item 1423 is no longer there and make no further changes.
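A sketch of that "delete by explicit id" idea; the in-memory store and function name are illustrative assumptions, not part of the original answer:

```python
# Hypothetical in-memory comment store keyed by id.
comments = {1422: "first comment", 1423: "last comment"}


def delete_comment(comment_id: int) -> str:
    """Idempotent delete: resubmitting the same request causes no further changes."""
    if comment_id not in comments:
        # Second submission (or a stale form): the item is already gone, so do nothing.
        return "already deleted"
    del comments[comment_id]
    return "deleted"


# The first POST deletes item 1423; a refresh/resubmit is harmless.
print(delete_comment(1423))  # "deleted"
print(delete_comment(1423))  # "already deleted"
```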
Should I use POST instead? Would POST slow the site down? There are very few cookies, so a browser may be able to submit the cookies and the POST in one packet, but I don't know whether POST and cookies must be separate.
Yes, it is better to use POST in your case to lower the security risk. Don't favour speed over security; go with POST. And no, POST and cookies won't clash with each other: the cookies travel in the headers of the same HTTP request as the POST body, so they don't need a separate round trip.
In the end, I would suggest you go with HTML Purifier to make your URLs and forms safe.
The risk you're discussing is known as a cross-site request forgery (CSRF) attack. The standard way to prevent it is the double-submit cookie pattern (send the same value once in a cookie and once in the form), or some other unique token that an attacker could not guess and replay via an included image. For more details on detection and prevention, see:
http://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF)
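A minimal sketch of the double-submit idea, again as a hypothetical Flask-style handler (the route names and cookie name are assumptions for illustration):

```python
import hmac
import secrets

from flask import Flask, abort, make_response, request

app = Flask(__name__)


@app.route("/form")
def show_form():
    # Issue the token as a cookie and embed the same value in the form.
    token = secrets.token_urlsafe(32)
    html = (
        '<form method="post" action="/delete">'
        f'<input type="hidden" name="csrf" value="{token}">'
        "<button>Delete</button></form>"
    )
    resp = make_response(html)
    resp.set_cookie("csrf", token, samesite="Lax")
    return resp


@app.route("/delete", methods=["POST"])
def delete():
    # Accept the request only if the cookie value and the form value match.
    # An attacker's page can force the cookie to be *sent*, but cannot read it,
    # so it cannot copy the value into a forged form.
    cookie_token = request.cookies.get("csrf", "")
    form_token = request.form.get("csrf", "")
    if not hmac.compare_digest(cookie_token.encode(), form_token.encode()):
        abort(403)
    return "ok"
```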
I mostly agree with status203. Beyond what he has said about POST alone not really helping, a couple of comments:
1) GETs are safe only if the application is written correctly; I have seen applications where GETs are used even to make changes. Also on this topic, if you return JSON data as a top-level array and your entry point is not protected against CSRF, then on some browsers the attacker may be able to steal the victim's data by enticing the victim to a website that has <script src="http://yourserver/json_rsp_entrypoint"></script> and then overriding the Array constructor.
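One common mitigation for that JSON-array issue (not spelled out in the original answer, so treat this as an illustrative sketch) is to avoid returning a bare top-level array, or to prefix the response so it cannot execute as a script:

```python
import json


def render_json(rows):
    """Wrap list results in an object and add a prefix the client strips before parsing.

    A bare top-level array is a valid JavaScript program, which is what makes the
    <script src=...> trick possible; an object literal behind an unparseable prefix
    is not executable when included as a script.
    """
    payload = json.dumps({"data": rows})
    return ")]}',\n" + payload


# The legitimate client strips the known prefix before parsing; a cross-site
# <script> include just gets a syntax error instead of your data.
print(render_json([{"id": 1423, "text": "hello"}]))
```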
2) While having something random in the parameter and then checking it against what is stored in the session works, this is complicated if you do not keep server-side sessions (for example if you have hundreds of servers and don't want to take the hit of querying the DB on every request). One alternative is to include MD5(session_cookie) as the CSRF token. This lets any server verify the token without touching the DB, and an attacker without XSS can't read the session cookie and so can't construct the token. Note that I do not recommend using the session cookie itself as the token, because that creates worse problems: the token can leak via the Referer header, or, if it sits in a hidden form field, it is exposed whenever the page is saved.
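A sketch of that stateless derivation as the answer describes it (MD5 of the session cookie); the function names are illustrative, and the hardening note at the end is an aside, not part of the original suggestion:

```python
import hashlib
import hmac


def csrf_token_from_cookie(session_cookie: str) -> str:
    # What the answer describes: derive the token directly from the session cookie,
    # so any server can verify it without a session store or DB lookup.
    return hashlib.md5(session_cookie.encode()).hexdigest()


def verify(session_cookie: str, submitted_token: str) -> bool:
    expected = csrf_token_from_cookie(session_cookie)
    return hmac.compare_digest(expected.encode(), submitted_token.encode())


# Hardening aside (not from the original answer): keying the hash with a
# server-side secret, e.g. hmac.new(SECRET, session_cookie.encode(), hashlib.sha256),
# means a leaked token reveals nothing about the cookie it was derived from.
```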