1
votes

I am building a log-analysis program in MapReduce, for which I am using MaxMind GeoIP data. Now I want to put the GeoIP data into the Distributed Cache. I am developing my application in Eclipse. Here is what I am doing:

Job job = new Job();        
DistributedCache.addCacheFile(new URI(args[3]), job.getConfiguration());

Where args[3] will have the path.

Here I am using it

protected void setup(Context context) {
    try {
        //String dbfile = "GeoIP//GeoIPCountry.dat";

        org.apache.hadoop.conf.Configuration conf = context.getConfiguration();

        Path[] dbfiles = DistributedCache.getLocalCacheFiles(conf);

        // GEOIP_MEMORY_CACHE - load database into memory; faster
        // performance but uses more memory, so increase the JVM heap size.
        // Note: getLocalCacheFiles returns an array, so pick the first entry;
        // calling toString() on the array itself would not give a usable path.
        cl = new LookupService(dbfiles[0].toString(), LookupService.GEOIP_MEMORY_CACHE);

    } catch (Exception e) {
        System.err.println("Error opening GeoIP data file.");
        System.err.println(e);
        System.exit(2);
    }
}

But while running I am getting the following error:

Exception in thread "main" java.lang.Error: Unresolved compilation problem: 
The method addCacheFile(URI, Configuration) in the type DistributedCache is not applicable for the arguments (URI, Configuration)

I am not able to figure out what is wrong. Please help.

1
Can you show your imports in the part when you're adding your file to the cache? - Charles Menguy

1 Answer

2
votes

It's picking up the wrong classes:

The method addCacheFile(URI, Configuration) in the type DistributedCache is not applicable for the arguments (URI, Configuration)

Check your imports for the URI and Configuration classes.

As per the documentation, they should be java.net.URI and org.apache.hadoop.conf.Configuration.

I think you might have accidentally imported the javax.security.auth.login.Configuration class from the JDK. That's not supposed to be used here.
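As a quick sanity check that java.net.URI is the class you want, the snippet below runs without Hadoop on the classpath; the HDFS location is made up purely for illustration:

```java
import java.net.URI;  // the URI class DistributedCache.addCacheFile expects

public class ImportCheck {
    public static void main(String[] args) throws Exception {
        // Parse a cache-file location the way the driver would;
        // this path is hypothetical.
        URI cacheUri = new URI("hdfs://namenode:9000/user/geoip/GeoIPCountry.dat");
        System.out.println(cacheUri.getScheme()); // hdfs
        System.out.println(cacheUri.getPath());   // /user/geoip/GeoIPCountry.dat
    }
}
```

If your editor auto-completed a different URI or Configuration class, replacing those imports with the java.net and org.apache.hadoop.conf ones should make the compile error go away.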