4
votes

I have a use case where the method to load my cache's data is a bulk call, but I'm never going to use getAll to get data from the cache. Is there a way to have multiple concurrent gets all block on a single loadAll? I don't want individual gets on different keys to result in multiple calls to the data source.

cache.get(key1); // cache miss, triggers a load
cache.get(key2); // separate thread; should block on the load already in progress

After looking into LocalCache, I think I'll have to implement my own synchronization: something like a local copy in my data accessor that only allows a call through to the data source every so many units of time. When a call does go through, it updates the local copy with a single assignment statement.
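Roughly, what I have in mind is something like this (a stdlib-only sketch; `fetchAllFromSource` is a hypothetical stand-in for the real bulk call, and the 30-second interval is arbitrary):

```java
import java.util.HashMap;
import java.util.Map;

class RateLimitedAccessor {
    private final long refreshMillis = 30_000L; // hypothetical refresh interval
    private volatile Map<String, Double> snapshot = new HashMap<>();
    private volatile long lastLoadMillis = 0L;

    Double get(String key) {
        if (System.currentTimeMillis() - lastLoadMillis > refreshMillis) {
            synchronized (this) {
                // re-check under the lock so only one thread refreshes
                if (System.currentTimeMillis() - lastLoadMillis > refreshMillis) {
                    snapshot = fetchAllFromSource(); // single assignment swaps the copy
                    lastLoadMillis = System.currentTimeMillis();
                }
            }
        }
        return snapshot.get(key);
    }

    // placeholder for the real bulk data-source call
    Map<String, Double> fetchAllFromSource() {
        Map<String, Double> m = new HashMap<>();
        m.put("a", 1.0);
        return m;
    }
}
```

The drawback, as noted above, is that reads between refreshes see the old snapshot rather than blocking on a fresh load.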

Am I missing something from Guava's cache library?

Edit:

I'm considering something like the following. However, it could keep returning stale data while loadAll is in progress. I'd prefer that everything block at load, with only the first request causing loadAll to proceed.

public class DataCacheLoader extends CacheLoader<String, Double>
{
    private final Cache<String, Double> cache;
    private final DataSource source; // the bulk data source
    private volatile ConcurrentMap<String, Double> currentData;
    private final AtomicBoolean isLoading;

    public DataCacheLoader( final Cache<String, Double> cache, final DataSource source )
    {
        this.cache = cache;
        this.source = source;
        isLoading = new AtomicBoolean( false );
    }

    @Override
    public Double load( final String key ) throws Exception
    {
        // Only one thread triggers the bulk load; others fall through
        // and may read stale (or null) data from currentData.
        if ( isLoading.compareAndSet( false, true ) )
        {
            try
            {
                cache.putAll( loadAll( Lists.newArrayList( key ) ) );
            }
            finally
            {
                isLoading.set( false );
            }
        }
        return currentData.get( key );
    }

    @Override
    public Map<String, Double> loadAll( final Iterable<? extends String> keys ) throws Exception
    {
        currentData = source.getAllData();
        return currentData;
    }
}
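For comparison, here's a minimal stdlib-only sketch of the fully blocking behavior I'm after: the first miss takes a lock and performs the bulk load, and every concurrent get blocks on that same lock until the load completes (`fetchAllFromSource` is a hypothetical stand-in for the real bulk call):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

class BlockingBulkCache {
    private final ReentrantLock loadLock = new ReentrantLock();
    private volatile Map<String, Double> data; // null until the first bulk load

    Double get(String key) {
        Map<String, Double> local = data;
        if (local == null) {
            loadLock.lock(); // every caller blocks here until the load finishes
            try {
                if (data == null) {              // only the first caller loads
                    data = fetchAllFromSource();
                }
                local = data;
            } finally {
                loadLock.unlock();
            }
        }
        return local.get(key);
    }

    // placeholder for the real bulk data-source call
    Map<String, Double> fetchAllFromSource() {
        Map<String, Double> m = new HashMap<>();
        m.put("key1", 1.0);
        m.put("key2", 2.0);
        return m;
    }
}
```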
just as an idea: call get(K key, Callable<? extends V> valueLoader) and pass a callable that will call loadAll. - kofemann
Maybe do this with a ForwardingCache? I'd be suspicious of recursive CacheLoaders, though. - Louis Wasserman
I thought about the ForwardingCache approach. However, for the specific use case (this is a generalization), I realized I was caching at the wrong granularity. If I want the whole load to block, cache the whole load. I'll leave the question since I suspect there are use cases where an approach to block everything may make sense. - Steven Hood
I think your (edited) solution is on the right track. If you want everything to block, then make your AtomicBoolean a volatile boolean and add a ReentrantLock. - fido

1 Answer

10
votes

Here's a solution that should do the trick. The idea is that instead of caching each individual key you cache the whole map with a single fixed key. The one drawback is that you won't be able to expire individual parts of the underlying map (at least not easily), but that may not be a requirement.

class MyCache {
  private static final Object KEY = new Object();
  private final LoadingCache<Object, Map<String, Double>> delegate =
      CacheBuilder.newBuilder()
          // configure cache (expiration, maximum size, etc.)
          .build(new CacheLoader<Object, Map<String, Double>>() {
            @Override
            public Map<String, Double> load(Object key) {
              return source.load(); // source: your bulk data source
            }
          });

  double get(String key) {
    // getUnchecked avoids the checked ExecutionException thrown by get
    return delegate.getUnchecked(KEY).get(key);
  }
}
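On Java 8+ the same "one key for the whole map" trick can be sketched without Guava using ConcurrentHashMap.computeIfAbsent, which likewise blocks concurrent callers while the mapping function runs (`fetchAllFromSource` is a hypothetical stand-in for the real bulk call):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class SingleKeyCache {
    private static final Object KEY = new Object();
    private final ConcurrentHashMap<Object, Map<String, Double>> delegate =
        new ConcurrentHashMap<>();

    double get(String key) {
        // concurrent first calls block while one thread runs the bulk load
        return delegate.computeIfAbsent(KEY, k -> fetchAllFromSource()).get(key);
    }

    // placeholder for the real bulk data-source call
    Map<String, Double> fetchAllFromSource() {
        Map<String, Double> m = new HashMap<>();
        m.put("key1", 1.0);
        return m;
    }
}
```

To expire the map, you'd remove the single entry (`delegate.remove(KEY)`), which mirrors the drawback noted above: you can't expire individual parts of the underlying map.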