0 votes

I am trying to develop a cache application that loads itself on server start, so that client applications can read the cached data through REST service calls.

Hence, I have to initialize the GemFire cache when the application is deployed. This would load the data (derived from the RDBMS), which is in the form of a Map, into the cache.

I have seen a CacheLoader load one entry at a time into a GemFire Region, but can Region.putAll(map) load all the data at once, or are there other methods?

Please help.


2 Answers

0 votes

I believe the <initializer> element would be a perfect match for your use case here. It is used to launch an application after initializing the cache, and it can certainly be used to populate the Regions upon cache initialization.

There are other options as well, such as writing a custom Function and executing it as soon as your startup script returns, but I think the <initializer> element is the appropriate choice here.
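
For illustration only, here is a rough sketch of what such an initializer class might look like, assuming a GemFire release that uses the com.gemstone.gemfire packages; the class name, Region name, and data-loading stub are all made up, and the class would be referenced from cache.xml via the <initializer> element:

    package example;

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;

    import com.gemstone.gemfire.cache.Cache;
    import com.gemstone.gemfire.cache.CacheFactory;
    import com.gemstone.gemfire.cache.Declarable;
    import com.gemstone.gemfire.cache.Region;

    // Referenced from cache.xml with:
    //   <initializer>
    //     <class-name>example.RegionWarmingInitializer</class-name>
    //   </initializer>
    public class RegionWarmingInitializer implements Declarable {

        @Override
        public void init(Properties props) {
            // The Cache is fully created by the time the <initializer> runs,
            // so it is safe to look it up and resolve the target Region.
            Cache cache = CacheFactory.getAnyInstance();
            Region<String, String> region = cache.getRegion("RegionOne");

            // Load the data from the RDBMS (stubbed out here) and push it
            // into the Region in a single putAll call.
            region.putAll(loadDataFromRdbms());
        }

        private Map<String, String> loadDataFromRdbms() {
            // Placeholder; a real implementation would query the database.
            Map<String, String> data = new HashMap<>();
            data.put("someKey", "someValue");
            return data;
        }
    }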

Hope this helps. Cheers.

0 votes

I would be very careful about putting all of the data from your underlying data source (e.g. an RDBMS) in memory in a java.util.Map, since it would be very easy to run out of memory (i.e. hit an OutOfMemoryError) pretty quickly, depending on the size of the result set.

Nonetheless, if you want an example of this, see here; the configuration is here.

Essentially, I am using a Spring BeanPostProcessor, the RegionPutAllBeanPostProcessor, to put a Map of data into a "target" Region.

For example, I have a Region (i.e. "RegionOne"), and I can use the RegionPutAllBeanPostProcessor to target this Region and put data into it from the Map.
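
The actual RegionPutAllBeanPostProcessor lives in the linked example; a simplified sketch of the idea (the property names here are mine, not necessarily the ones used in the example) looks roughly like this:

    import java.util.Collections;
    import java.util.Map;

    import org.springframework.beans.BeansException;
    import org.springframework.beans.factory.config.BeanPostProcessor;

    import com.gemstone.gemfire.cache.Region;

    public class RegionPutAllBeanPostProcessor implements BeanPostProcessor {

        // The Map of data to load, typically configured in Spring XML.
        private Map<Object, Object> regionData = Collections.emptyMap();

        // The bean name of the Region to populate (e.g. "RegionOne").
        private String targetRegionBeanName;

        public void setRegionData(Map<Object, Object> regionData) {
            this.regionData = regionData;
        }

        public void setTargetRegionBeanName(String targetRegionBeanName) {
            this.targetRegionBeanName = targetRegionBeanName;
        }

        @Override
        public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
            return bean;
        }

        @Override
        @SuppressWarnings("unchecked")
        public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
            // Once the target Region bean has been fully initialized by the
            // Spring container, populate it with the configured Map of data.
            if (bean instanceof Region && beanName.equals(targetRegionBeanName)) {
                ((Region<Object, Object>) bean).putAll(regionData);
            }
            return bean;
        }
    }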

Obviously, you have many different options when it comes to triggering this Region load/"warming": a GemFire Initializer, a Spring BeanPostProcessor (docs here), or even a Spring ApplicationListener listening for ApplicationContextEvents, such as a ContextRefreshedEvent (docs here).
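
For completeness, the ApplicationListener variant might look something like this sketch (again, the class name and data-loading stub are hypothetical):

    import java.util.Collections;
    import java.util.Map;

    import org.springframework.context.ApplicationListener;
    import org.springframework.context.event.ContextRefreshedEvent;

    import com.gemstone.gemfire.cache.Region;

    public class RegionWarmingApplicationListener implements ApplicationListener<ContextRefreshedEvent> {

        private final Region<String, Object> region;

        public RegionWarmingApplicationListener(Region<String, Object> region) {
            this.region = region;
        }

        @Override
        public void onApplicationEvent(ContextRefreshedEvent event) {
            // Fired once the ApplicationContext (and therefore the Region bean)
            // has been fully initialized and refreshed.
            region.putAll(loadDataFromRdbms());
        }

        private Map<String, Object> loadDataFromRdbms() {
            // Placeholder; query your data source of choice here.
            return Collections.emptyMap();
        }
    }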

However, while the Map in this test is hard-coded in XML, you could envision populating this Map from any data source, including a java.sql.ResultSet derived from a SQL query executed against the RDBMS.

So perhaps a better approach/solution, one that would not eat up as much memory, would be to use a BeanPostProcessor "injected" with Spring's JdbcTemplate or a JPA EntityManager, or better yet Spring Data JPA, and load the data from your framework of choice to put it directly into the Region. After all, if Region.putAll(:Map) is essentially just iterating the Map.Entries of the incoming Map and calling Region.put(key, value) individually for each Map.Entry (this, this and this), then clearly it is not buying you that much, and it certainly does not justify putting all the data in memory before putting it into the Region.

For instance, most ResultSets are implemented with a DB cursor that lets you fetch a certain number of rows at a time, rather than all possible rows. Clearly, your SQL query can also be more selective about which rows are returned based on interest/pertinence; think of loading only a subset of the most important data, or using some other criteria specifiable in the query predicate. Then simply put the data into the Region as you iterate the ResultSet.
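
As a rough, hypothetical sketch of that last point (the table and column names are made up), using Spring's JdbcTemplate with a RowCallbackHandler lets you stream rows from the DB cursor straight into the Region without ever building the intermediate Map:

    import java.sql.ResultSet;
    import java.sql.SQLException;

    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.core.RowCallbackHandler;

    import com.gemstone.gemfire.cache.Region;

    public class JdbcRegionLoader {

        private final JdbcTemplate jdbcTemplate;

        private final Region<Long, String> region;

        public JdbcRegionLoader(JdbcTemplate jdbcTemplate, Region<Long, String> region) {
            this.jdbcTemplate = jdbcTemplate;
            this.region = region;
        }

        public void load() {
            // Hint to the JDBC driver to pull rows from the cursor in chunks
            // rather than materializing the entire ResultSet at once.
            jdbcTemplate.setFetchSize(500);

            jdbcTemplate.query("SELECT id, value FROM example_table WHERE active = 1",
                new RowCallbackHandler() {
                    @Override
                    public void processRow(ResultSet rs) throws SQLException {
                        // Called once per row; each entry goes straight into the Region.
                        region.put(rs.getLong("id"), rs.getString("value"));
                    }
                });
        }
    }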

Food for thought.

-John