I have an application that uses an external cache for some data (specifically, memcached on another server). There's an option to compress the data with zlib before caching. The question is: at what data size does compression become worth it? E.g., for a 10-byte data item it's probably useless to waste time on compressing/decompressing it, but for 10K of data it may be worth it. The data stored will be mostly ASCII strings.
I know this depends a lot on network speed, CPU speed, the data itself, and so on, but are there any guidelines or heuristics? It doesn't have to be perfect; anything that saves some cycles would be great.
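For reference, the kind of logic I have in mind is roughly this (a minimal sketch in Python; the 1 KB threshold and the one-byte flag are just placeholders I picked, and the threshold is exactly the number I'm asking about):

```python
import zlib

# Placeholder threshold -- this is the value I'm trying to choose.
COMPRESS_MIN_SIZE = 1024  # bytes

COMPRESSED_FLAG = b"Z"    # one-byte marker so the reader knows how to decode
RAW_FLAG = b"R"


def encode_value(data: bytes) -> bytes:
    """Compress only if the item is big enough and compression actually helps."""
    if len(data) >= COMPRESS_MIN_SIZE:
        compressed = zlib.compress(data)
        # Keep the compressed form only if it's genuinely smaller (incl. flag byte).
        if len(compressed) + 1 < len(data):
            return COMPRESSED_FLAG + compressed
    return RAW_FLAG + data


def decode_value(blob: bytes) -> bytes:
    """Reverse of encode_value: inspect the flag byte and decompress if needed."""
    flag, payload = blob[:1], blob[1:]
    if flag == COMPRESSED_FLAG:
        return zlib.decompress(payload)
    return payload
```

So the real question is how to pick `COMPRESS_MIN_SIZE` sensibly for mostly-ASCII string values.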