http://www.bigbold.com/snippets/posts/show/3286

Memcached is not very easy to introduce to a large Rails installation. It also chews up a lot of memory on the box, and overall the cached-model approach does not work the way I needed it to. Basically, I have just a “few” queries that I needed to cache, because pagination sucks just that bad in Rails.
So, I built my own cache, similar to how I build them in PHP, except instead of a disk cache I am using MySQL itself to cache its own results.
First, we need a table to hold all this info (note the ‘blob’ field)
CREATE TABLE `cacheditems` (
  `id` int(11) NOT NULL auto_increment,
  `cachekey` varchar(255) default NULL,
  `created` datetime default NULL,
  `expires` datetime default NULL,
  `content` longblob,
  `cachehit` int(11) NOT NULL,
  PRIMARY KEY (`id`),
  KEY `cacheditems_cachekey_index` (`cachekey`),
  KEY `cacheditems_created_index` (`created`),
  KEY `cacheditems_expires_index` (`expires`)
);
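The cachehit column also makes it easy to see which queries are earning their keep. A quick look at the hottest entries (an illustrative query, not part of the setup):

SELECT cachekey, cachehit, created, expires
FROM cacheditems
ORDER BY cachehit DESC
LIMIT 10;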
Then we create a model called “Cacheditem” with the following class methods:
require 'digest/md5'

class Cacheditem < ActiveRecord::Base
  # Look for an unexpired cache entry for this SQL.
  def self.checkfor(sql)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    logger.info "%%% checking for key #{key}"
    #logger.info "%%% checking by sql #{sql[0]}"
    Cacheditem.find(:first, :conditions => ["cachekey = ? AND expires > NOW()", key])
  end

  # Fetch the cached result set, bump the hit counter, and sweep
  # out expired rows while we are at it.
  def self.getcached(sql)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    logger.info "%%% getting by key #{key}"
    #logger.info "%%% getting by sql #{sql[0]}"
    getc = Cacheditem.find(:first, :conditions => ["cachekey = ?", key])
    Cacheditem.update(getc.id, :cachehit => getc.cachehit + 1)
    Cacheditem.delete_all "expires < NOW()" # cleaner
    Marshal.load(getc.content)
  end

  # Marshal the result set and store it with a 30-minute expiry.
  def self.storeresult(sql, result)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    logger.info "%%% storing by key #{key}"
    content = Marshal.dump(result)
    logger.level = 4 # this stops display in logs of the marshal data
    ci = new
    ci.cachekey = key
    ci.created  = Time.now
    ci.expires  = 30.minutes.from_now # change as needed
    ci.content  = content
    ci.cachehit = 0
    ci.save
    result
  end
end
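For the curious, the cache key is just an MD5 of the marshalled SQL, so identical queries always map to the same row. A quick irb check (illustrative only, the query string is made up):

require 'digest/md5'

sql = "SELECT * FROM posts ORDER BY created_at DESC LIMIT 20"
Digest::MD5.hexdigest(Marshal.dump(sql))
# => the same 32-character hex string every time for identical SQL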
Then, in application.rb I added the following (note it has to be defined as a class method on ActiveRecord::Base, since it calls connection, sanitize_sql, and instantiate):

class ActiveRecord::Base
  # Like find_by_sql, but checks the cacheditems table first and
  # stores the instantiated result set on a miss.
  def self.find_by_sql_cache(sql)
    if Cacheditem.checkfor(sql)
      Cacheditem.getcached(sql)
    else
      result = connection.select_all(sanitize_sql(sql), "#{name} Load").collect! { |record| instantiate(record) }
      Cacheditem.storeresult(sql, result)
    end
  end
end
Just throw “_cache” onto the end of any “find_by_sql” call you need to cache, and there you are.
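For example, assuming a hypothetical Post model:

# Uncached: hits MySQL every time.
@posts = Post.find_by_sql("SELECT * FROM posts ORDER BY created_at DESC LIMIT 20")

# Cached: served from the cacheditems table for up to 30 minutes.
@posts = Post.find_by_sql_cache("SELECT * FROM posts ORDER BY created_at DESC LIMIT 20")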
This works very fast, works very well, and doesn’t hog your memory. It cleans up after itself in the database, and perhaps it does that too often. It would be easy to add a standard garbage-collection routine that runs at random intervals instead, but I felt this approach gave me much better stats on the actual thirty-minute cache.
If you use Zabbix for monitoring your network, you can have fun graphs of cache statistics by adding entries like the following to your zabbix_agentd.conf.
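A minimal sketch of what such entries might look like, using Zabbix’s UserParameter mechanism (the item keys, database name, and credentials here are all placeholders; substitute your own):

# Hypothetical item keys; point mysql at your own database and credentials.
UserParameter=railscache.hits,mysql -u zabbix -pPASSWORD -N -B -e "SELECT SUM(cachehit) FROM cacheditems" myapp_production
UserParameter=railscache.items,mysql -u zabbix -pPASSWORD -N -B -e "SELECT COUNT(*) FROM cacheditems" myapp_production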