Question
I'm using a combination of Dragonfly and rack/cache hosted on Heroku.
I'm using Dragonfly for uploaded assets. Thumbnails are processed on-the-fly and stored in rack/cache for fast delivery from memcached (via the Memcachier addon).
Regular static assets are also cached in memcached via rack/cache.
My problem is that any uploaded files over 1MB are causing a 500 error in my application.
2013-07-15T10:38:27.040992+00:00 app[web.1]: DalliError: Value too large, memcached can only store 1048576 bytes per key [key: d49c36d5db74ef45e957cf169a0b27b83b9e84de, size: 1502314]
2013-07-15T10:38:27.052255+00:00 app[web.1]: cache: [GET /media/BAhbBlsHOgZmSSIdNTA3Njk3ZWFiODBmNDEwMDEzMDAzNjA4BjoGRVQ/WTW_A5Flyer_HealthcareMedicalObsGynae_WEB.pdf] miss, store
2013-07-15T10:38:27.060583+00:00 app[web.1]: !! Unexpected error while processing request: undefined method `each' for nil:NilClass
Memcached has a 1MB per-key limit, so I can understand why my asset was not cached, but I would rather it didn't break serving assets.
I'm not even sure where this error is coming from. Presumably from one of the other rack middlewares?
Increasing the maximum value size doesn't seem to have any effect.
config.cache_store = :dalli_store, ENV["MEMCACHIER_SERVERS"].split(","), {
  :username        => ENV["MEMCACHIER_USERNAME"],
  :password        => ENV["MEMCACHIER_PASSWORD"],
  :value_max_bytes => 5242880 # 5MB
}
Long term, I know that moving this sort of asset off of Heroku is a sensible move, but that won't be a quick job.
What can I do to serve these assets on Heroku in the meantime without errors?
Answer 1:
I had the same issue as @jordelver and managed to get around MemCachier's limit by monkey patching Dragonfly::Response:
module Dragonfly
  class Response
    private

    def cache_headers
      if job.size > 1048576
        {
          "Cache-Control" => "no-cache, no-store",
          "Pragma"        => "no-cache"
        }
      else
        {
          "Cache-Control" => "public, max-age=31536000", # 1 year
          "ETag"          => %("#{job.signature}")
        }
      end
    end
  end
end
Essentially, if the size is over 1048576 bytes, send a no-cache header.
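The decision can be sketched as a standalone helper, which makes the 1MB threshold easy to test in isolation. The names below (MEMCACHED_LIMIT, cache_headers_for) are illustrative, not part of Dragonfly's API:

```ruby
# Illustrative sketch of the size-based header decision above.
# MEMCACHED_LIMIT and cache_headers_for are hypothetical names, not Dragonfly's.
MEMCACHED_LIMIT = 1_048_576 # memcached's default 1MB per-key limit

def cache_headers_for(size, signature)
  if size > MEMCACHED_LIMIT
    # Too big for memcached: tell Rack::Cache not to store this response.
    { "Cache-Control" => "no-cache, no-store", "Pragma" => "no-cache" }
  else
    # Small enough: cache publicly for a year, with an ETag from the job signature.
    { "Cache-Control" => "public, max-age=31536000", "ETag" => %("#{signature}") }
  end
end
```

Oversized assets are still served on every request; they just bypass the cache instead of blowing up in it.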
Answer 2:
So contrary to @jordelver's question, I find that setting Dalli's :value_max_bytes option does work. I'm setting up Rack::Cache in a slightly different way, which may make the difference. This is what my production.rb contains to configure Rack::Cache:
client = Dalli::Client.new(ENV["MEMCACHIER_SERVERS"],
                           :username        => ENV["MEMCACHIER_USERNAME"],
                           :password        => ENV["MEMCACHIER_PASSWORD"],
                           :value_max_bytes => 10485760) # 10MB

config.action_dispatch.rack_cache = {
  :metastore   => client,
  :entitystore => client
}
config.static_cache_control = "public, max-age=2592000"
With the above, some errors will be printed to the logs for values over 1MB, but they won't cause a 5xx error for the client, just a cache miss.
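As a rough illustration of why the client no longer sees a 5xx: with :value_max_bytes set, Dalli rejects the oversized write itself, logging an error and returning a falsy result, instead of letting memcached fail mid-request. The class below simulates that behavior for clarity; it is not Dalli's actual implementation:

```ruby
# Simulated sketch of Dalli's value_max_bytes guard -- not Dalli's real code.
class FakeCacheClient
  def initialize(value_max_bytes)
    @limit = value_max_bytes
    @store = {}
  end

  def set(key, value)
    if value.bytesize > @limit
      # Dalli logs "Value too large ..." here; the write simply fails.
      return false
    end
    @store[key] = value
    true
  end

  def get(key)
    @store[key] # nil on a miss, so the asset is regenerated rather than erroring
  end
end
```

The oversized response is served normally on each request; it just never lands in the cache.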
P.S. I work for MemCachier :) so we're interested in sorting this out. Please let me know if it works.
Answer 3:
My application.js was too big for rack-cache, so I did:
# in config/environments/development.rb
config.action_dispatch.rack_cache = {
  metastore:   'file:/var/cache/rack/meta',
  entitystore: 'file:tmp/cache/rack/body'
}
And it works!
This stores the cache metadata and entity bodies on the filesystem rather than in memcached, so the 1MB limit no longer applies.
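If you still want memcached speed for the (small) metadata while keeping large bodies off memcached, the two stores can be mixed. This is an untested sketch, assuming the same Dalli::Client (`client`) shown in answer 2:

```ruby
# Hypothetical hybrid setup: metadata in memcached, entity bodies on disk.
# Assumes `client` is the Dalli::Client configured in answer 2.
config.action_dispatch.rack_cache = {
  :metastore   => client,                     # small records, fine in memcached
  :entitystore => 'file:tmp/cache/rack/body'  # bodies may exceed 1MB
}
```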
Source: https://stackoverflow.com/questions/17656999/how-to-cache-files-over-1mb-with-rack-cache-on-heroku