PHP regex for URL validation, filter_var is too permissive

后悔当初 2020-12-11 21:18

First, let's define a "URL" according to my requirements.

The only protocols optionally allowed are http:// and https://

then a man
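To illustrate the complaint in the question: a quick sketch of how permissive `FILTER_VALIDATE_URL` actually is (the inputs below are just example values, not from the question):

```php
<?php
// filter_var() with FILTER_VALIDATE_URL checks only generic URI syntax
// (scheme + host), not whether the host looks like a real domain or
// whether the scheme is one you actually want to allow.
var_dump(filter_var('http://x', FILTER_VALIDATE_URL));          // accepted: single-label host, no TLD
var_dump(filter_var('ftp://example.com', FILTER_VALIDATE_URL)); // accepted: any scheme passes
```

Both calls return the input string (i.e. "valid"), which is why a stricter, hand-rolled check is being asked for.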

3 Answers
  • 2020-12-11 21:24

    As a starting point you can use this one; it's written for JS, but it's easy to adapt for PHP's preg_match.

    /^(https?\://)?(www\.)?([a-z0-9]([a-z0-9]|(\-[a-z0-9]))*\.)+[a-z]+$/i
    

    For PHP, this one should work:

    $reg = '@^(https?\://)?(www\.)?([a-z0-9]([a-z0-9]|(\-[a-z0-9]))*\.)+[a-z]+$@i';
    

    Note that this regexp validates only the domain part, but you can build on it, or split the URL at the first slash '/' (after "://") and validate the domain part and the rest separately.

    BTW: it would also validate "http://www.domain.com.com", but that is not an error, because a subdomain URL such as "http://www.subdomain.domain.com" is valid too! And there is almost no way (or at least no operationally easy way) to validate for a proper TLD with a regex, because you would have to write every possible TLD inline into your regex, ONE BY ONE, like this:

    /^(https?\://)?(www\.)?([a-z0-9]([a-z0-9]|(\-[a-z0-9]))*\.)+(com|it|net|uk|de)$/i
    

    (this last one, for instance, would validate only domains ending in .com/.it/.net/.uk/.de). New TLDs come out all the time, so you would have to adjust your regex every time a new one appears, which is a pain in the neck!
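    A minimal sketch of the pattern above in use with preg_match (the test URLs are example values, not from the answer):

    ```php
    <?php
    // The answer's pattern: optional http(s)://, optional www., then one or
    // more dot-separated labels ending in an alphabetic TLD. Anchored with
    // ^...$, so it matches the domain part only (no path or query string).
    $reg = '@^(https?\://)?(www\.)?([a-z0-9]([a-z0-9]|(\-[a-z0-9]))*\.)+[a-z]+$@i';

    var_dump((bool) preg_match($reg, 'https://www.example.com')); // true
    var_dump((bool) preg_match($reg, 'example'));                 // false: no dot-separated TLD
    var_dump((bool) preg_match($reg, 'http://example.com/path')); // false: path not covered
    ```

    The last case shows the caveat from the answer: anything after the domain has to be validated separately.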

  • 2020-12-11 21:25

    It may vary, but in most cases you don't really need to check the validity of a URL at all.

    If it's vital information and you trust your users enough to let them provide it through a URL, you can trust them enough to give a valid URL.

    If it isn't vital information, then you just have to check for XSS attempts and display the URL the user entered.

    You can manually prepend "http://" if you don't detect one, to avoid navigation problems.


    I know I'm not giving you an alternative solution, but maybe the best way to solve performance and validity problems is simply to avoid unnecessary checks.
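    The "prepend http:// if missing" suggestion can be sketched like this (`withScheme` is a hypothetical helper name, not a built-in):

    ```php
    <?php
    // Prepend "http://" only when the input has no scheme at all.
    // Assumes the value has already been sanitized against XSS elsewhere
    // (e.g. with htmlspecialchars() at output time).
    function withScheme(string $url): string {
        // Match any URI scheme ("http://", "https://", "ftp://", ...).
        return preg_match('@^[a-z][a-z0-9+.-]*://@i', $url)
            ? $url
            : 'http://' . $url;
    }

    echo withScheme('www.example.com'), "\n";  // http://www.example.com
    echo withScheme('https://example.com'), "\n"; // unchanged
    ```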

    0 讨论(0)
  • 2020-12-11 21:39

    You could use parse_url to break up the address into its components. While it's explicitly not built to validate a URL, analyzing the resulting components and matching them against your requirements would at least be a start.
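    A sketch of that approach under the question's rules (optional scheme, http/https only); `validateUrl` and the host pattern are illustrative assumptions, not library code:

    ```php
    <?php
    // Break the URL into components with parse_url(), then apply the
    // requirements to each component separately.
    function validateUrl(string $url): bool {
        $parts = parse_url($url);
        if ($parts === false) {
            return false; // seriously malformed
        }
        if (isset($parts['scheme']) && !in_array($parts['scheme'], ['http', 'https'], true)) {
            return false; // only http:// and https:// are allowed
        }
        // parse_url() puts a scheme-less input like "www.example.com"
        // into 'path' rather than 'host', so fall back to it.
        $host = $parts['host'] ?? $parts['path'] ?? '';
        // Require at least one dot-separated label plus an alphabetic TLD.
        return (bool) preg_match('@^([a-z0-9-]+\.)+[a-z]+$@i', $host);
    }

    var_dump(validateUrl('https://www.example.com')); // true
    var_dump(validateUrl('ftp://example.com'));       // false: disallowed scheme
    var_dump(validateUrl('www.example.com'));         // true: scheme is optional
    ```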
