Is it okay to have a very long .htaccess file?

悲&欢浪女 2020-12-16 02:05

I am giving out URLs to people on a website that actually point to something ugly (on the same site). For example,

http://www.mydomain.com/cool-URL

actually points to a much uglier internal URL on the same server. There will be many of these mappings, so the .htaccess file holding the rewrite rules could end up very long. Is that okay?

5 Answers
  • 2020-12-16 02:41

    No, it's not OK and will hamper your page-load speed.

    .htaccess files are evaluated on every server request, even for static images, CSS, and JS files. So you are asking the webserver to parse 1,000+ lines of (possibly regex-based) rules while serving each request.

    Also, a parent directory's .htaccess file is processed for files residing in subdirectories too. So if your large .htaccess sits in the root directory of the website, it will be processed for every request for files in the subdirectories as well (along with any .htaccess file in the subdirectory itself).

    That, my friend, is a lot of processing. If you have a page with 10 images, for example, the file gets processed 11 times, and the more processing in the file, the more cycles each request takes. So yes, anything in an .htaccess file has an impact. Is that impact noticeable? It is hard to say when it becomes an issue, but the file would have to be pretty big, since the processing of each rule is relatively simple; in your case, it is pretty big.

    The key with an .htaccess file is to make it smart. You would not want to list 200 entries; you can do it smartly with just a few lines (if you want to use .htaccess at all), as sketched below.
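
    A minimal sketch of the "few smart lines" idea, assuming a single hypothetical lookup.php script (not part of the original answer) that resolves each pretty slug itself, e.g. from a database:

    # One catch-all rule instead of hundreds of one-off rules.
    RewriteEngine On
    # Skip requests for real files and directories, then hand the slug to the resolver.
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^([A-Za-z0-9-]+)/?$ /lookup.php?slug=$1 [L,QSA]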

  • 2020-12-16 02:45

    Having that many rules is really inadvisable. It's not that they are in a .htaccess file, or that they are tested for each request. (Depending on the configuration, that might also happen if you put them into your server or virtual-host configuration.)

    It's the sheer number of rules being tested that is the problem: depending on the rules, every rule may have to be tested until a match is found.

    Well, you could counteract that by arranging the rules in order of matching probability, so that the probability of finding a match early is high. But the complexity is still O(n) in the worst case.

    If you really need that many mappings and the mappings are fixed, you can use a RewriteMap hash file, which has a complexity of O(1) instead of the O(n) of separate rules (see the sketch below). Or you shift the mapping into your PHP application and do it there.
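
    A minimal sketch of the RewriteMap approach; the map name and file paths are placeholders. Note that the RewriteMap directive itself must go in the server or virtual-host configuration, not in .htaccess:

    # urlmap.txt holds one "cool-url real-target" pair per line.
    # Compile it into a hash file for O(1) lookups:
    #   httxt2dbm -i urlmap.txt -o urlmap.map
    RewriteEngine On
    RewriteMap prettyurls "dbm:/etc/apache2/urlmap.map"
    # One rule replaces all the separate ones; unmapped paths pass through unchanged.
    RewriteRule ^/(.+)$ "${prettyurls:$1|/$1}" [L]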

  • 2020-12-16 02:59

    It should be fine; you just might experience server lag when it gets really long. I would look into the mod_rewrite documentation: you might be able to automate the forwards with a few lines and a regex. I don't know enough about the URL variables that will be passed to give you an example.

  • 2020-12-16 03:01

    It's definitely not OK, as a few people here have mentioned.

    Do you know all the URLs you wish to rewrite beforehand? If so, you can store the rules in a database, iterate through them, pre-generate the actual URLs, and store them in memcache, with the key being the good-looking URL and the value being the actual URL of the content.

    Then, when a request comes in, look up the key in memcache and redirect the user to the real URL (a rough sketch follows below). I don't even think you need .htaccess for this.
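
    A rough Python sketch of that flow, assuming the pymemcache client and a cache already populated with pretty-URL → real-URL pairs; the client choice, key scheme, and handler shape are illustrative assumptions, not part of the original answer:

    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))  # assumed memcached address

    def resolve(slug):
        """Return the real URL stored under the pretty slug, or None."""
        value = cache.get(slug)
        return value.decode() if value is not None else None

    def handle_request(slug):
        """Decide what to do with an incoming pretty URL."""
        real_url = resolve(slug)
        if real_url is None:
            return 404, None       # unknown pretty URL
        return 301, real_url       # your framework sends the Location header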

  • 2020-12-16 03:06

    I would perform a simple test: generate a large .htaccess file with random URLs, and measure the resulting performance yourself.

    import random
    import string

    def rand_string():
        """Return a random 4-10 character string of ASCII letters."""
        length = random.randint(4, 10)
        return ''.join(random.choice(string.ascii_letters) for _ in range(length))

    # Emit 1000 rewrite rules mapping random paths to random query strings.
    for i in range(1000):
        print("RewriteRule %s http://www.mydomain.com/boring.php?%s [R]" %
              (rand_string(), rand_string()))
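
    To actually measure the impact, one option is ApacheBench, which ships with Apache; the URL and request counts here are placeholders:

    ab -n 1000 -c 10 http://www.mydomain.com/some-test-path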
    