I have a game server running on legacy App Engine (Python 2.7). I've migrated the server to Python 3/Flask and all the various bits. The new server is connected to a Redis instance, while the old server uses the legacy memcache available to Python 2.7.
I'm thinking of connecting the 2.7 version of my server to Redis as well, since I'm running this as a staged migration. That way I can split traffic between the py3 server and the py2 server while they share the same cache as I test, and a few beta users can talk to the new server and co-exist with the current one.
I've got a version of my py2.7 server talking to Redis, but I'm finding the docs on storing ndb models in Redis a bit confusing. It appears I can't just drop an ndb model instance into Redis the way I could with the old legacy memcache (see the sketch below for roughly what I've been trying).
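To make it concrete, here's roughly where I've ended up on the py2.7 side. This is a sketch rather than working production code; redis_client, the connection details, and the key layout are just my own placeholders. Legacy memcache pickled values for me, so with redis-py I'm serializing by hand, and I'm not sure that's even the intended approach:

import pickle

import redis

# Placeholder connection details; the real host comes from config.
redis_client = redis.StrictRedis(host='10.0.0.3', port=6379)

def cache_games_list(users_key, games):
    # games is a plain Python list of legacy ndb entities.
    # Legacy memcache pickled this for me; redis-py only accepts bytes/str,
    # so I pickle the list myself before SET.
    cacheKey = users_key.urlsafe() + "_gamesList"
    redis_client.set(cacheKey, pickle.dumps(games, pickle.HIGHEST_PROTOCOL))

def get_cached_games_list(users_key):
    cacheKey = users_key.urlsafe() + "_gamesList"
    raw = redis_client.get(cacheKey)
    return pickle.loads(raw) if raw is not None else None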
The migration docs say you can set up Redis as a "global cache" with App Engine, using the following example:
from google.cloud import ndb

client = ndb.Client()
global_cache = ndb.RedisCache.from_environment()  # reads the REDIS_CACHE_URL env var
with client.context(global_cache=global_cache):
    books = Book.query()
    for book in books:
        print(book.to_dict())
I'm not super clear on what this means. Do I need to structure every query I want cached this way, or is there a one-time setup after which models will just auto-cache? Would the above example pull from the cache automatically if the data is already there?
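For reference, here's roughly how I've wired the client into the py3/Flask server so far. It's a sketch adapted from the Flask example in the migration docs; Game, the route, and the middleware name are my own placeholders:

from flask import Flask
from google.cloud import ndb

app = Flask(__name__)
client = ndb.Client()
global_cache = ndb.RedisCache.from_environment()  # expects REDIS_CACHE_URL to be set


class Game(ndb.Model):
    # Stand-in for my real model.
    name = ndb.StringProperty()


def ndb_wsgi_middleware(wsgi_app):
    # Wrap every request in an ndb context so handlers can use the models;
    # I'm assuming passing global_cache here is what turns on Redis caching.
    def middleware(environ, start_response):
        with client.context(global_cache=global_cache):
            return wsgi_app(environ, start_response)
    return middleware


app.wsgi_app = ndb_wsgi_middleware(app.wsgi_app)


@app.route('/games')
def games():
    # Is this fetch served from the Redis global cache when possible,
    # or does it always go to Datastore?
    results = Game.query().fetch()
    return {'count': len(results)}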
Currently, with the legacy memcache, I collect a bunch of model instances (games that a particular user is a member of, in my case) into a Python list and cache them with a key like
cacheKey = userskey.urlsafe()+"_gamesList"
Any time a game for this user changes in some way (updated, deleted, added, etc.) I delete the cacheKey, and the next time that user queries their games list I rebuild the cache. It doesn't look like I can store data that way in Redis quite so simply. Also, since a user interacts with the server many times, I'd want the model that represents the user itself cached too; every time that user talks to my server I need to load their user account by its key. A simplified version of what the py2.7 code does today is below.
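This is roughly the existing legacy pattern (simplified; Game and its members property are stand-ins for my real models):

from google.appengine.api import memcache
from google.appengine.ext import ndb


class Game(ndb.Model):
    # Stand-in for my real model.
    members = ndb.KeyProperty(repeated=True)


def get_games_list(users_key):
    cacheKey = users_key.urlsafe() + "_gamesList"
    games = memcache.get(cacheKey)
    if games is None:
        # Rebuild from Datastore; legacy memcache happily stores the
        # pickled list of ndb entities as a single value.
        games = Game.query(Game.members == users_key).fetch()
        memcache.set(cacheKey, games)
    return games


def invalidate_games_list(users_key):
    # Called whenever one of this user's games is added/updated/deleted.
    memcache.delete(users_key.urlsafe() + "_gamesList")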
I guess I'm just confused about the global cache and how it all differs from the legacy memcache. Currently I get close to a 90% cache-hit rate with the legacy memcache, so it's certainly worth trying to replicate that with Redis. Even a pointer to a practical example app would be helpful; surprisingly, searching around, I haven't found one.