1 vote

Inside my system I have data with a short lifespan: it stays relevant only for a short time, but it should still be persisted in the data store. This data may also change frequently for each user, for instance every minute. The number of users is potentially large, so I want to speed up the put/get of this data by using memcache with a delayed persist to Bigtable.

Putting/getting objects by key is no problem. But for some use cases I need to retrieve all data from the cache that is still alive, and the API only lets me get data by key. Hence I need some key holder that knows all keys of the data inside memcache... But any object may be evicted at any time, and then I would need to remove its key from the global key registry (and such an eviction listener doesn't exist in GAE). Storing all these objects in a single list or map is not acceptable for my solution, because each object should have its own time to evict...

Could somebody recommend which way I should go?


3 Answers

2 votes

It sounds like what you are really attempting to do is have some sort of queue for data that you will be persisting. Memcache is not a good choice for this since, as you've said, it is not reliable (nor is it meant to be). Perhaps you would be better off using Task Queues?
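A minimal sketch of that idea, assuming a worker handler mapped to /tasks/persist does the actual datastore write; the function name, URL, and parameters here are illustrative, not part of the original answer:

from google.appengine.api import taskqueue

def save_later(user_id, payload):
    # Enqueue a write instead of relying on memcache to survive eviction.
    # payload is assumed to be a string (e.g. already-serialized data);
    # a worker mapped to /tasks/persist performs the actual datastore put.
    taskqueue.add(
        url='/tasks/persist',
        params={'user_id': user_id, 'payload': payload},
        countdown=60)  # delay the write by a minute before persisting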

0 votes

Memcache isn't designed for exhaustive access, and if you need it, you're probably using it the wrong way. Memcache is a sharded hashtable, and as such really isn't designed to be enumerated.

It's not clear from your description exactly what you're trying to do, but it sounds like at the least you need to restructure your data so you're aware of the expected keys at the time you want to write it to the datastore.
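For example (a sketch; the key format and the user_ids list are assumptions), deriving the memcache key from data you already have means the full key set can be reconstructed at write time without any separate key registry:

from google.appengine.api import memcache

def cache_key(user_id):
    # The key is derived from the user id, so it can be rebuilt later.
    return 'userdata:%s' % user_id

def load_still_cached(user_ids):
    # get_multi returns only the entries that are still in memcache;
    # evicted ones are simply missing from the result dict.
    return memcache.get_multi([cache_key(uid) for uid in user_ids])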

0 votes

I am encountering the very same problem. I might solve it by building a decorator and wrapping it around the evicting function, so that the entity's key is automatically deleted from the key directory/placeholder in memcache whenever you call for eviction.

Something like this:

from google.appengine.api import memcache
from google.appengine.ext import db


def decorate_evict_decorator(key_prefix):

    def evict_decorator(evict):
        def wrapper(self, entity_name_or_id):  # self is the class here, since evict is bound as a classmethod
            mem = memcache.Client()
            # key directory layout: {"placeholder": {key_prefix + "|" + entity_name_or_id: key_or_id}}
            placeholder = mem.get("placeholder") or {}  # could use gets with cas
            evict(self, entity_name_or_id)
            # drop the evicted entity's key from the shared key directory
            placeholder.pop(key_prefix + "|" + entity_name_or_id, None)
            mem.set("placeholder", placeholder)
        return wrapper
    return evict_decorator


class car(db.Model):
    car_model = db.StringProperty(required=True)
    company = db.StringProperty(required=True)
    color = db.StringProperty(required=True)
    engine = db.StringProperty()

    @classmethod
    @decorate_evict_decorator("car")
    def evict(cls, car_model):
        # delete process: remove the cached car entity here
        pass

class engine(db.Model):
    model = db.StringProperty(required=True)
    cylinders = db.IntegerProperty(required=True)
    litres = db.FloatProperty(required=True)
    manufacturer = db.StringProperty(required=True)

    @classmethod
    @decorate_evict_decorator("engine")
    def evict(cls, engine_model):
        # delete process: remove the cached engine entity here
        pass

You could improve on this according to your data structure and flow (one possible addition, registering keys on the put side, is sketched below), and it is worth reading up on Python decorators.
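For instance, the put side could record keys in the same placeholder directory when an entity is cached. A sketch; cache_and_register and ttl are illustrative names, and the directory layout mirrors the decorator above:

from google.appengine.api import memcache

def cache_and_register(key_prefix, entity_name, entity, ttl):
    # Cache the entity under its own expiry time...
    mem = memcache.Client()
    directory_key = key_prefix + "|" + entity_name
    mem.set(directory_key, entity, time=ttl)
    # ...and record its key in the shared directory so it can be found later.
    placeholder = mem.get("placeholder") or {}
    placeholder[directory_key] = entity_name
    mem.set("placeholder", placeholder)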

You might want to add a cron job to keep your datastore in sync with memcache at a regular interval.
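A sketch of such a cron target, assuming the placeholder directory keys double as the memcache keys of cached db.Model instances and that /tasks/sync is scheduled in cron.yaml:

from google.appengine.api import memcache
from google.appengine.ext import db, webapp

class SyncHandler(webapp.RequestHandler):
    # Cron target, e.g. "url: /tasks/sync, schedule: every 5 minutes" in cron.yaml.
    def get(self):
        mem = memcache.Client()
        placeholder = mem.get("placeholder") or {}
        # Fetch every tracked entry that is still cached, in one round trip.
        cached = memcache.get_multi(list(placeholder.keys()))
        for entity in cached.values():
            if isinstance(entity, db.Model):
                entity.put()  # flush the still-cached entity to the datastore

application = webapp.WSGIApplication([('/tasks/sync', SyncHandler)])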