Question
I'm hoping this is a simple one.
I run a Rails web app where I'm hosting about 100 school websites. The one app handles all the sites, and I have a management interface where we can add and remove schools, etc.
I want to add a stat to this interface: the total disk space used by each school. Each school's files are stored in a separate directory tree, so they're easy to locate. The only problem is that I need it to be fast. So the question is: what's the fastest way to find this info? If it could be computed via a Ruby call on the fly, that would be great, but I'm open to whatever works. Ideally I'd like to avoid having to cache and background-generate this data (at least at the Rails level). :)
Answer 1:
If you want to go with pure Ruby you can try this code, although if you're looking for speed I'm sure du would be faster.
require 'find'

# Walk the directory tree and sum the size of every regular file.
def dir_size(dir_path)
  size = 0
  Find.find(dir_path) { |f| size += File.size(f) if File.file?(f) }
  size
end
dir_size('/tmp/')
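One caveat I'd add (my addition, not from the original answer): a file can disappear between the File.file? check and the File.size call, e.g. while an upload is being replaced, in which case File.size raises Errno::ENOENT. A defensive variant might look like this:

require 'find'

def dir_size(dir_path)
  size = 0
  Find.find(dir_path) do |f|
    begin
      size += File.size(f) if File.file?(f)
    rescue Errno::ENOENT, Errno::EACCES
      # File vanished or became unreadable mid-walk; skip it.
    end
  end
  size
end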
Answer 2:
`du -sk "/your/path/here"`.split("\t").first.to_i * 1024 # du -sk prints 1K blocks; multiply by 1024 for bytes
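A note I'd add (not from the original answer): since the path here ultimately comes from per-school directory names, interpolating it into backticks means a crafted name could be interpreted by the shell. Passing arguments as an array avoids the shell entirely; a sketch, with du_bytes as a hypothetical helper name:

require 'open3'

# Run du without invoking a shell, so the path is never parsed
# as shell syntax. du -sk prints 1024-byte blocks (POSIX).
def du_bytes(path)
  out, status = Open3.capture2('du', '-sk', path)
  raise "du failed for #{path}" unless status.success?
  out.split("\t").first.to_i * 1024
end

du_bytes('/tmp') # => bytes used under /tmp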
Answer 3:
Have you tried just running du on each directory on demand? On my aging box I can run du on a 15M directory in ~4ms and on a 250M one in ~50ms. Both seem reasonable for the task at hand. How large are the directories? Before you try to really optimize this, make sure it's really worth your while. YAGNI and all that.
You could always keep track at upload time, when they provide you with the file. That way you only need to track the delta as files are added or removed; a sketch of that idea follows.
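A minimal sketch of the delta-tracking idea (the School model and disk_usage_bytes column are hypothetical, not from the thread): keep a running byte total per school and adjust it on every upload and deletion, so the dashboard becomes a plain column read instead of a filesystem walk.

require 'active_record'

class School < ActiveRecord::Base
  # Call after a file has been written to the school's directory.
  def record_upload(path)
    increment!(:disk_usage_bytes, File.size(path))
  end

  # Call before the file is unlinked, while it can still be stat'ed.
  def record_deletion(path)
    decrement!(:disk_usage_bytes, File.size(path))
  end
end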
Source: https://stackoverflow.com/questions/3632074/what-is-the-fastest-way-to-calculate-disk-usage-per-customer