Question:
Just came across a website that lists all hidden files. I tried it on Facebook's directory "hashtag/" and the results showed a whole bunch of files from http://www.facebook.com/hashtag/
Here's the website that does this: https://pentest-tools.com/website-vulnerability-scanning/discover-hidden-directories-and-files
So my main question here is: is there any way to protect your site from being scanned by another website that reveals private files like tokens.php, sessions.php, templates/, models/, configs/, etc.?
This has got me really worried. Say we are building a website that holds important files and structures; if someone wanted to see what we're holding in a specific folder, is there any way to prevent it from showing up on that website, or on any other site that performs this kind of scan?
I know you can do this with .htaccess, but could you show me an example of preventing scanning on multiple folders?
Answer 1:
The main option is `Indexes`: you can turn directory listings off in your .htaccess with

```apache
Options -Indexes
```

The second option is blocking access outright with a deny rule:

```apache
Order Deny,Allow
Deny from All
```

(On Apache 2.4+ the equivalent is `Require all denied`.)

Say you have a folder called `inc` that holds some files you don't want accessible via a URL, but you do want your other PHP scripts to be able to `include` them: toss the deny rule above into an .htaccess file in your `inc` folder.
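Since the question asks about covering multiple folders at once: assuming Apache with mod_alias enabled, a single .htaccess at the web root can both disable listings and hide several directories in one rule. The folder names below are just the examples from the question, and this is a sketch, not a complete hardening config:

```apache
# Disable directory listings site-wide (inherited by subdirectories;
# needs "AllowOverride Options" in the server config to take effect)
Options -Indexes

# Pretend these folders do not exist: answer any request into them
# with a 404 instead of a 403 (requires mod_alias)
RedirectMatch 404 ^/(inc|configs|models|templates)(/.*)?$
```

Returning 404 rather than 403 has the nice side effect that a scanner cannot tell a blocked folder apart from one that genuinely is not there.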
Here is a link to a great resource on .htaccess rules: https://github.com/phanan/htaccess
Answer 2:
One common approach is to keep files that are not accessed directly outside the web root, so they are unreachable from the web.
For example, if you have your web root at /var/www/html, place your source code outside html and it will be impossible to load from the internet. This, however, requires some way of loading/requiring those files in your framework, but that is pretty common these days. See PSR-4: Autoloading.
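The idea can be sketched in a few lines of Python. The paths are hypothetical, and `resolve` mimics, in very simplified form, how a web server maps a request path to a file under the document root, which is exactly why files kept one level up can never be served:

```python
from pathlib import Path

DOCROOT = Path("/var/www/html")  # hypothetical web root from the answer

def resolve(request_path):
    """Map a URL path to a file under DOCROOT, the way a server does.

    Returns None if the resolved path would escape the web root. Code
    kept one level up (e.g. /var/www/src/config.php) is never reachable,
    no matter what path the client requests.
    """
    candidate = (DOCROOT / request_path.lstrip("/")).resolve()
    if DOCROOT == candidate or DOCROOT in candidate.parents:
        return candidate
    return None
```

So `resolve("index.php")` maps to a file inside the root, while `resolve("../src/config.php")` is rejected: the scanner can only ever guess at paths inside `html`.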
The tool you linked to does not have a "magic" way of finding files; blocking or hiding files with Apache is absolute (see this answer for more .htaccess details). The tool simply tries a series of common words and file names and checks for a valid 200 response. Unless you move your files outside the web root, this kind of guess-and-check enumeration is impossible to defend against entirely; you can only deny access to what it finds.
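To make the enumeration concrete, here is a minimal Python sketch of what such a scanner does. The wordlist entries are illustrative only, and the `fetch` callback is a hypothetical stand-in for the tool's HTTP client:

```python
# Minimal sketch of dictionary-based path enumeration, the technique
# such scanning tools use. Wordlist entries are illustrative only.
WORDLIST = ["inc/", "configs/", "models/", "templates/", "tokens.php"]

def candidate_urls(base, words):
    """Build the full URLs a scanner would probe for a given site."""
    return [base.rstrip("/") + "/" + w for w in words]

def scan(base, words, fetch):
    """Return every probed URL for which fetch() reports HTTP 200.

    fetch(url) -> status code; injected here so the guess-and-check
    logic can be shown without a real HTTP client.
    """
    return [url for url in candidate_urls(base, words) if fetch(url) == 200]
```

A real tool simply plugs an HTTP client into `fetch` and a large dictionary into `words`; nothing about the target site is "discovered" beyond guessing names and checking the response.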
As you say, it is possible to deny access to directories and files with .htaccess, and also to turn off file listing. However, note that .htaccess settings are inherited from directory to subdirectory, which means that if you block all access at the root level, the block applies to every single file and directory below it.
Source: https://stackoverflow.com/questions/35949015/anyway-protecting-your-site-from-external-site-scandir